( sigh )
Everybody, shut the fuck up.
I read the NCRI-Rutgers report in question. You can, too.
The report’s conclusion states…

Conclusion: Substantial Differences in Hashtag Ratios Raise Concerns about TikTok’s Impartiality.

Given the research above, we assess a strong possibility that content on TikTok is either amplified or suppressed based on its alignment with the interests of the Chinese Government. Future research should aim towards a more comprehensive analysis to determine the potential influence of TikTok on popular public narratives. This research should determine if and how TikTok might be utilized for furthering national/regional or international objectives of the Chinese Government.

Should such research determine that TikTok users exhibit attitudes and assessments of world events aligned with the information distortions that we have discovered, democracies will need to consider appropriate counter-measures to better protect information integrity and mitigate potential real-world impacts.
…but the data they present doesn’t prove that statement at all.
The report authors describe their data collection methodology at the top of page 5. They state that they’re using each platform’s advertising management system to count the total number of posts/entries that feature a given hashtag, and then comparing the counts on one platform to the counts on the other.
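To make the mechanics concrete, here’s a minimal sketch of what that comparison boils down to. This is not the report’s code, and the hashtags and counts below are made-up placeholders: pull an aggregate post count per hashtag from each platform’s ad tooling, then divide one by the other.

```python
# A minimal sketch of the comparison the report describes on page 5.
# The hashtags and counts here are illustrative placeholders, not the
# report's data; in practice each count would be read off a platform's
# ad-management tooling by hand.
hashtag_counts = {
    # hashtag: (instagram_count, tiktok_count)
    "#some_pop_star": (1_200_000, 3_400_000),
    "#some_political_topic": (80_000, 9_000),
}

for tag, (instagram, tiktok) in hashtag_counts.items():
    ratio = instagram / tiktok if tiktok else float("inf")
    print(f"{tag}: Instagram-to-TikTok post-count ratio = {ratio:.1f}")
```

Everything the report concludes is downstream of ratios like these.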
Think about that for a second. Those numbers are just aggregates of tagged user posts. To assert that ByteDance is “amplifying” or “suppressing” a given topic, the data would need to show evidence of raw posts in a given category being edited or deleted en masse, or that the content feeds and searches each platform serves to its users are being modified to hide or promote posts on specific subjects. The data doesn’t address any of that.
What the data DOES show is how many posts on each platform align with given topics that advertisers have access to. Taken at face value, this data can tell us a lot of interesting things about the users of these particular platforms. For example, TikTok seems to be a lot more into Shakira than Harry Styles. That’s interesting, I guess. Also, Instagram users are making more posts about Uyghurs than TikTok users. That’s also interesting, but that’s not necessarily evidence that ByteDance is suppressing content. What seems more likely is that people who give enough of a shit about Uyghurs to write posts about it aren’t using TikTok.
So ok, fine, let’s get into some deep-data-fuckery hypotheticals:
Could TikTok posts pertaining to topics the Chinese government has expressed opinions about be getting edited or deleted? Maybe. That should be easy enough to collect data on and test.
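For what it’s worth, the shape of such a test is simple. A rough sketch, where `still_exists` is a hypothetical stand-in for whatever lookup access (API, scraping, archived snapshots) a researcher actually has:

```python
def removal_rate(sampled_ids, still_exists):
    """Fraction of an earlier sample of post IDs that have since vanished.

    sampled_ids:  post IDs collected for one hashtag at time T.
    still_exists: hypothetical lookup callable run at time T + N days.
    """
    removed = sum(1 for pid in sampled_ids if not still_exists(pid))
    return removed / len(sampled_ids) if sampled_ids else 0.0

# The actual test: compare removal rates for hashtags the Chinese government
# has opinions about against neutral control hashtags on the same platform.
# A markedly higher removal rate for the sensitive set would be evidence of
# the suppression the report alleges; aggregate count ratios are not.
```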
Could the aggregation of TikTok posts for the advertising/marketing systems be deliberately fudging the numbers by under-counting posts for some topics and/or over-counting for others? Maybe. The data doesn’t prove it. But… why? The function of those advertising systems is to allow marketers to buy ads and figure out costs. Lying about those numbers would mean ByteDance was scamming advertisers. Admittedly, that would be quite a scandal if it were happening, but that’s nowhere near the same thing as the report’s conclusion.
The report’s conclusion is a full-throated statement that ByteDance is tipping the scales in terms of what content is being served to TikTok’s users. This might actually be happening, and it’s absolutely worth investigating, but the evidence in this report does not back up that claim.
Finally, a pro-tip: if you’re skimming a research report and spot the authors misusing the phrase “begging the question”, it’s time to crank up your bullshit detector to maximum.
I think it’s a fair conclusion, and it’s caveated with a note that more research is needed.
Some pretty good sleuthing there