• blazera@kbin.social

    because we’ve gone in a circle: I ask whether the site we’re on right now is doing anything better with regard to this problematic material, since folks seem to care about Twitter’s failure to address it. You respond that it’s not about their failure to address the material, but their lack of a response to the regulatory inquiry. I point out that they did respond, and your reply is that, oh, they actually need a good answer about how they’re addressing the material. Which is the same premise as the article and what my first comment was about. It’s hypocrisy, because the standard isn’t being applied to the fediverse: no one is up in arms about our lack of automatic detection of problematic material or surveillance of private messaging. Because we care about privacy when we’re not being blinded by well-intentioned Musk hate.

    • squiblet@kbin.social

      I quoted the part of the article showing that they didn’t respond to several questions:

      X’s noncompliance was more serious, the regulator said, including failure to answer questions about how long it took to respond to reports of child abuse, steps it took to detect child abuse in livestreams and its numbers of content moderation, safety and public policy staff.

      I speculated that they probably also need the answers to be adequate, but that’s not what the article or the fine is about.

      If one of the individual sites in the Fediverse were asked by Australian regulators, I bet they’d respond fully. It’s not quite the same situation as Twitter, either - none of these sites is large enough to require many staff members, and none runs its own livestreaming platform.