TikTok’s new community guidelines officially went into effect on Friday, May 17. Last month, TikTok announced that it would change the eligibility guidelines of its For You feeds to limit hate speech and health misinformation. Now, those guidelines are live, and they should improve your scrolling experience (or at least cut down on the number of almond mom and toxic gym bro videos you see).
Content flagged as promoting disordered eating or conspiracy theories is now ineligible to be promoted on For You feeds. TikTok also said that creators who continually share this kind of content will be made harder to find in search, according to last month’s press release. A new Creator Code of Conduct also makes it clear that influencers taking part in the platform’s monetization and reward programs will be held to a higher standard of behavior than folks outside of those programs.
The future of TikTok in the US remains uncertain, and the company faces renewed legal challenges. That hasn’t stopped the social media platform from introducing new features and updates. Most recently, TikTok announced it would be implementing labels on AI-generated content. TikTok also reached a new agreement with Universal Music Group, including new artist protections against AI and bringing back TikTok sounds from popular artists like Billie Eilish and SZA.
Any time a social media platform updates its community guidelines or eligibility standards, there are a lot of questions about what’s changing and what content, if any, will be affected. Below, I break down the main points and what you need to know about the new guidelines and eligibility standards.
TikTok adds safeguards against hate speech, health misinformation
Community guidelines are rules that lay out what content is and isn’t allowed on the platform. On TikTok, there’s an extra layer to these guidelines that outlines what content is eligible to be recommended across For You feeds.
TikTok’s community guidelines now include two new standards concerning hate speech and health misinformation. They state that content flagged as promoting these things, like videos about disordered eating and conspiracy theories, won’t be eligible to appear in TikTok users’ For You feeds.
TikTok gives a few examples of the kind of content it’s making ineligible. For health misinformation, content showing or describing “potentially harmful weight management behaviors” is off-limits. Think videos from self-described dietitians with dubious credentials pushing intermittent fasting. Also ineligible is content that promotes weight loss products, rapid weight loss exercise regimens and cosmetic surgery without proper risk warnings.
Under the misinformation category, TikTok states that “conspiracy theories that are unfounded and claim that certain events or situations are carried out by covert or powerful groups, such as ‘the government’ or a ‘secret society’” will be limited from being shared and possibly removed. Also ineligible under this standard are posts that misrepresent results from scientific studies (like a video claiming a study found vaccines are harmful when the study actually found the opposite) and posts that repurpose media, such as footage of a concert crowd passed off as a political protest.
By adding these new eligibility standards, TikTok is trying to keep you from falling into potentially dangerous rabbit holes. The algorithm behind your For You page is powerful: when you interact with a certain kind of content, TikTok sends you more of it. Keeping hate speech and health misinformation off your For You page is meant to cut off those cycles of misinformation before they start. Now that the guidelines are in effect, we’ll have to see how effective TikTok actually is at enforcing them.
Updates to TikTok’s warning strike system
In addition to updating its community guidelines, TikTok is rolling out a new feature called Account Check that lets you verify your account’s standing. TikTok has used a strike system for the past year: each violation results in a strike, and once you reach a certain number of strikes (which varies by the kind of violation), your account is banned.
Account Check should provide more clarity about which, if any, of your videos TikTok has flagged as violating its policies. The feature audits your account and your last 30 posts to highlight any content that’s been flagged as violating TikTok’s guidelines. You’ll also be able to see if you’ve been restricted from using certain features like direct messaging, live streams and commenting. You will continue to be able to appeal TikTok’s decisions.
For more, check out how to use Meta AI on Instagram and what Threads users should know about the fediverse.