TikTok has directed children's accounts to pornographic content within a small number of clicks, according to a report by a campaign group.

Global Witness set up fake accounts using a 13-year-old's birth date and turned on the video app's "restricted mode", which limits exposure to "sexually suggestive" content.

Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.

The terms suggested under the "you may like" feature included "very very rude skimpy outfits" and "very rude babes" – and then escalated to terms such as "hardcore pawn [sic] clips". For three of the accounts the sexualised searches were suggested immediately.

After a "small number of clicks" the researchers encountered pornographic content ranging from women flashing to penetrative sex. Global Witness said the content attempted to evade moderation, usually by embedding the clip within an innocuous picture or video. For one account the process took two clicks after logging on: one click on the search bar and then one on the suggested search.

Global Witness, a climate organisation whose remit includes investigating big tech's impact on human rights, said it conducted two batches of tests: one before the implementation of child protection rules under the UK's Online Safety Act (OSA) on 25 July and another after.

It added that two of the videos featured someone who appeared to be under 16 years old and had been sent to the Internet Watch Foundation, which monitors online child sexual abuse material.

Global Witness claimed TikTok was in breach of the OSA, which requires tech companies to prevent children from encountering harmful content such as pornography.

A spokesperson for Ofcom, the UK communications regulator charged with overseeing the act, said: "We appreciate the work behind this research and will review its findings."

Ofcom's codes for adhering to the act state that tech companies that pose a medium or high risk of showing harmful content must "configure their algorithms to filter out harmful content from children's feeds". TikTok's content guidelines ban pornographic content.

TikTok said that, after being contacted by Global Witness, it had removed the offending videos and made changes to its search recommendations.

"As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies, and launch improvements to our search suggestion feature," said a spokesperson.
