TikTok has spent months developing new ways to age-restrict certain kinds of content as part of a broader push to improve safety features for younger users. Earlier this year, the app introduced a classification system called Content Levels to help it identify more “adult” content.
The company has now shared another update on those efforts. In a blog post, TikTok announced a new version of its “borderline suggestive model,” which it uses to detect “sexually explicit, suggestive, or borderline content.” According to a TikTok spokesperson, the updated model is better at identifying “borderline content”: videos that don’t explicitly violate the app’s rules but may not be appropriate for younger users.
TikTok isn’t the only platform that filters suggestive content out of recommendations. Instagram has long worked to keep questionable material out of its own recommendations. But content with more “adult” themes that stops short of explicit nudity has historically been harder for automated systems to identify consistently. TikTok didn’t say how much more accurate the new model is, but it noted that over the previous 30 days it had “prevented underage accounts from accessing over 1 million blatantly sexually explicit videos.”
The app is also now letting creators restrict their videos to adult viewers. That option was previously limited to live videos, but it will soon be available for short-form clips as well.