When a machine moderates content, it evaluates text and images as data, using an algorithm trained on existing data sets. The process for selecting that training data has come under fire, as it has been shown to carry racial, gender, and other biases.

  • RQG
    10 months ago

    Having largely undisclosed, privately owned platforms and algorithms dictate our cultural exchange, the spread of news, the topics of discourse, and other societally important interactions is such a horrible idea. I wish this were more obvious to the public so governments would end it. It divides societies, poisons public discourse, and skews it with racist biases and toward hatred.