• yeahiknow3@lemmings.world · 6 days ago

    Where’s the critique coming from? The Wikipedia article seems to have nothing but positive things to say. Might be an error. Ironic.

    Scientific studies[23] using its ratings note that ratings from Media Bias/Fact Check show high agreement with an independent fact checking dataset from 2017,[8] with NewsGuard[9] and with BuzzFeed journalists.[10] When MBFC factualness ratings of ‘mostly factual’ or higher were compared to an independent fact checking dataset’s ‘verified’ and ‘suspicious’ news sources, the two datasets showed “almost perfect” inter-rater reliability.
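
    For scale: “almost perfect” is the conventional Landis and Koch label for an agreement statistic like Cohen’s kappa above 0.80. Here’s a minimal sketch of how such a score is computed, using made-up ratings and assuming kappa is the statistic behind the quote (the studies themselves may use a different coefficient):

        from collections import Counter

        # Hypothetical binary ratings of the same news sources by two raters:
        # 1 = factual/verified, 0 = not factual/suspicious. Illustrative data only.
        mbfc  = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1]
        other = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1]

        n = len(mbfc)
        # Observed agreement: fraction of sources both raters label identically.
        p_o = sum(a == b for a, b in zip(mbfc, other)) / n
        # Expected chance agreement, from each rater's marginal label frequencies.
        c_mbfc, c_other = Counter(mbfc), Counter(other)
        p_e = sum((c_mbfc[k] / n) * (c_other[k] / n) for k in set(mbfc) | set(other))
        # Cohen's kappa: agreement beyond chance, normalized.
        kappa = (p_o - p_e) / (1 - p_e)
        print(f"observed = {p_o:.2f}, chance = {p_e:.2f}, kappa = {kappa:.2f}")
        # Here kappa comes out around 0.87; above 0.80 is conventionally
        # read as "almost perfect" on the Landis and Koch scale.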

    • PhilipTheBucket@ponder.cat · 6 days ago

      It’s from https://en.m.wikipedia.org/wiki/Wikipedia:Reliable_sources/Perennial_sources#Media_Bias/Fact_Check

      There is consensus that Media Bias/Fact Check is generally unreliable, as it is self-published. Editors have questioned the methodology of the site’s ratings.

      I think the perennial sources list gets a lot more attention than the wiki page for MBFC itself, and the standards for judging a source reliable there are probably higher.

      • yeahiknow3@lemmings.world · 6 days ago

        I read that. My best guess is that this is either an error that hasn’t been updated in light of empirical studies corroborating MBFC’s reliability, or, more likely, that any self-published list gets the “unreliable” sticker automatically.

        Also, making claims about “a consensus” without sourcing them is mighty suspicious. Disappointed.

        • goferking0@lemmy.sdf.org · 6 days ago

          They’re saying the parts where MBFC draws on other data are fine, like its fact-checking agreeing with other checkers’ because they all use the same sources. But the rest, like the bias ratings, can’t be trusted, since those come from MBFC’s own unscientific methods.

          • PhilipTheBucket@ponder.cat · 6 days ago

            They’re not saying that. How did you summarize 23 words using 39 words, and get the summary wrong?

            They’re saying that there is no external professional vouching for MBFC’s conclusions, which is their usual gold standard for things being “reliable.” And that, on top of that, people within Wikipedia have specifically pointed out flaws with how MBFC does things, without any of the qualifications and categories that you added.

              • PhilipTheBucket@ponder.cat · 6 days ago

                Got it, that does make sense. You should know, though, that Wikipedia on the content side is a different thing from Wikipedia on the talk page side.

                People can have nice things to say about a source in the Wikipedia article about that source, on the content side, while there’s still a consensus on the talk page side that the source is unreliable and shouldn’t be used for sourcing claims about other matters on other Wikipedia pages. The big table that I and someone else linked to is a good summary of the consensus on the talk page side, which is what’s most relevant here.

                • goferking0@lemmy.sdf.org · 6 days ago

                  A 2018 year-in-review and prospective on fact-checking from the Poynter Institute (which develops PolitiFact[27]) noted a proliferation of credibility score projects, including Media Bias/Fact Check, writing that “While these projects are, in theory, a good addition to the efforts combating misinformation, they have the potential to misfire,” and stating that “Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific.”[6] Also in 2018, a writer in the Columbia Journalism Review described Media Bias/Fact Check as “an armchair media analysis”[28] and characterized their assessments as “subjective assessments [that] leave room for human biases, or even simple inconsistencies, to creep in”.[29] A study published in Scientific Reports wrote: “While [Media Bias/Fact Check’s] credibility is sometimes questioned, it has been regarded as accurate enough to be used as ground-truth for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems.”[19]
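
                  The “ground-truth” use that last study mentions is essentially distant supervision: every article from an outlet inherits the outlet’s source-level rating as its training label. A minimal sketch of that labeling step, with a made-up ratings table and made-up URLs (none of this is MBFC’s actual data or any real API):

                      from urllib.parse import urlparse

                      # Made-up, MBFC-style source-level factualness ratings
                      # (illustrative only; not MBFC's real ratings).
                      SOURCE_RATINGS = {
                          "example-news.com": "high",
                          "rumor-mill.net": "low",
                      }

                      articles = [
                          {"url": "https://example-news.com/story-1", "text": "..."},
                          {"url": "https://www.rumor-mill.net/story-2", "text": "..."},
                      ]

                      def label_article(article):
                          # Each article inherits its outlet's rating; strip "www."
                          # so the domain matches the keys in the ratings table.
                          domain = urlparse(article["url"]).netloc.removeprefix("www.")
                          return SOURCE_RATINGS.get(domain)  # None if the outlet is unrated

                      labeled = [(a["text"], label_article(a)) for a in articles]
                      print(labeled)  # [('...', 'high'), ('...', 'low')]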

                  • PhilipTheBucket@ponder.cat · 6 days ago

                    Yes! You have successfully found the content page. If only someone had kindly explained to you that there’s a whole other side of Wikipedia which is more relevant to this discussion. It would have been nice for you to be able to have a whole patient explanation about how it all works.

        • PhilipTheBucket@ponder.cat · 6 days ago

          Tell me you have no idea how Wikipedia works, without telling me you have no idea.

          You’re putting trust in the stuff that doesn’t mean very much, and “best-guessing” that the stuff that is dependable is not.