TikTok finally moves against disordered eating, removes 91 million videos
TikTok has announced that it’s cracking down on videos promoting abnormal and unhealthy eating after a long campaign by disordered eating support groups, politicians and regulators.
More than a billion people worldwide have downloaded TikTok, and the platform has long faced criticism for its laissez-faire approach to content moderation, especially given the youth of its user base: TikTok is most popular among adolescent and pre-adolescent girls.
Among the most widely condemned content on TikTok are videos promoting unhealthy eating. While videos which outright promote eating disorders such as anorexia and bulimia have already been banned, others more subtly encouraging disordered eating practices have slipped through the net. This includes the What I Eat In A Day trend, which campaigners believe has incentivised dangerously low-calorie diets and a culture of shame.
An official statement from TikTok reads: “We’re making this change, in consultation with eating disorders experts, researchers, and physicians, as we understand that people can struggle with unhealthy eating patterns and behaviour without having an eating disorder diagnosis. Our aim is to acknowledge more symptoms, such as over-exercise or short-term fasting, that are frequently under-recognised signs of a potential problem.”
The statement emphasises that TikTok uses “a combination of technology and people to identify and remove violations of our Community Guidelines, and we will continue training our automated systems and safety teams to uphold our policies.”
An official transparency report published by TikTok shows the number of videos removed from the platform for violating its Community Guidelines. Just over 89 million videos were taken down in the last six months of 2020; this figure rose dramatically to almost 235 million between January and September 2021, with over 91 million taken down between July and September alone. (These figures do not include the final quarter of the year.)
Of the 91 million videos taken down by TikTok in the third quarter of 2021, 88% were flagged and removed before being seen by any viewers. More than half were removed because the material was potentially harmful to minors; other red-flag categories, many of which overlap, include illegal activity, sex, nudity, violence, self-harm, and the promotion of hateful attitudes.
TikTok is taking this action after representatives of the platform were brought before the US Senate in October 2021 to face questioning over the kind of content the platform allows and the potential harm to its young users. Particular concerns were raised over the prevalence of content that appears to promote eating disorders to impressionable, vulnerable children.
As well as taking action to protect users from such potentially harmful content, TikTok has said it will change its community guidelines to make clearer to those posting videos what constitutes unacceptable material. These guidelines already prohibit hate speech, including such anti-trans behaviour as deadnaming and misgendering, as well as the promotion of conversion therapies.
On top of this, TikTok says it is developing new technologies to protect the site's younger users from content that might cause harm, by identifying potentially problematic videos and preventing younger users from seeing them in the first place.