In the first quarter of 2021, the video-sharing platform took down almost 7.3 million accounts suspected of belonging to children under 13, the minimum age allowed on the platform.
In its Community Guidelines Enforcement Report, TikTok said it aimed to "help the industry push forward when it comes to transparency and accountability around user safety".
The report also revealed that 61.95 million videos were taken down for violating the site's rules, which was fewer than 1% of all uploads.
Of those, 82% were removed before being viewed, 91% before any user reports, and 93% within 24 hours of being posted.
Meanwhile, over 1.9 million ads were rejected for violating the site's policies and guidelines.
Furthermore, over 11 million accounts were removed for violating guidelines and terms of service.
The site's head of safety Cormac Keenan said: "To bring more visibility to the actions we take to protect minors, in this report we added the number of accounts removed for potentially belonging to an under-age person.
"In order to continue strengthening our approach to keeping TikTok a place for people 13 and over, we aim to explore new technologies to help with the industry-wide challenge of age assurance."
Recently, TikTok introduced new measures to protect teenagers: private messaging and live-streaming are restricted to users aged 16 and over, and accounts belonging to users under 16 are automatically set to private.
BANG ShowBiz Tech