New court documents reveal that Russia is keeping a very, very long list of influencers to spread its propaganda.

The Russian disinformation plot revealed in a Justice Department indictment this week may just be the tip of the iceberg, according to newly unsealed court documents.

On Wednesday, the DOJ announced it would seize 32 internet domains linked to a larger Kremlin scheme to promote disinformation and influence the 2024 election. The Russian campaign, known as Doppelganger, uses AI-generated content to create “fake news” boosted through social media with the aim of electing Donald Trump.

Notably, the documents released Wednesday included an affidavit stating that a Russian company is keeping a list of more than 2,800 influencers worldwide, about one-fifth of whom are based in the United States, to monitor and potentially groom to spread Russian propaganda. The affidavit does not include the full list of influencers, but it is still a terrifying indicator of how deep the Russian plot to interfere in U.S. politics really goes.

  • ochi_chernye@startrek.website · 2 months ago

    I just think calling people bots and shills has no place in honest discourse and the brushstroke always tends to get bigger and bigger.

    Bots and shills have no place in honest discourse, but they obviously exist. Should we pretend they don’t—assume everyone is arguing in good faith, regardless of how blatantly dishonest and inconsistent they are? What would you suggest?

    I don’t disagree that there’s a slippery-slope problem; there’s no shortage of fringe internet echo chambers that dismiss all dissenting opinions as coming from NPCs, CIA shills, shitlibs, bloodmouths, breeders, <insert dehumanizing label>, etc.

    • Grimy@lemmy.world · edited · 2 months ago

      Should we pretend they don’t—assume everyone is arguing in good faith

      It’s a tough problem, but essentially yes. We should only ban based on content, so anything pro-Putin would get the hammer, but that comes with its own problems. It’s hard to draw the line. Is being pro-Israel an acceptable stance (not morally, that’s obvious, but ban-wise)? What about being pro-gasoline cars? I’ve been tempted many times to assume people bashing EVs are oil industry shills, but really it’s just people who fell for the propaganda, not people actively participating in it. For the most part, downvotes do their job, but everyone knows those can easily be manipulated as well.

      If the news were about pro-AI bots floating around, I would probably be accused of being one, because I’m very outspoken about it and it’s a dissenting opinion on Lemmy.

      I just don’t think it’s a good standard to keep. I don’t have a solution, but I think trying to call people out on it will just end with people calling each other that whenever an argument goes badly. In the end, I view it as a form of rhetoric.

      • ochi_chernye@startrek.website · edited · 2 months ago

        I guess what we want to do is cultivate a community where people, and especially bots, have a hard time engaging dishonestly. Having said that, I’m no closer to knowing how to do it. The struggle against misinformation and disinformation seems like an arms race where the bad actors will always have the advantage.