A new report claims TikTok’s algorithm recommends pornography and sexualised clips to minors. Researchers created fake child accounts, enabled safety settings, and still received sexually explicit search prompts. These led to videos of simulated masturbation and, at worst, explicit pornography. TikTok says it acted quickly after being informed and insists it remains committed to providing safe, age-appropriate experiences for children.
Fake accounts reveal hidden risks
In July and August, Global Witness researchers set up four TikTok profiles, posing as 13-year-olds with false birth dates; the platform did not request additional identification. The investigators then activated TikTok’s “restricted mode”, which the company promotes as a filter against sexual or mature content. Despite this, the accounts received sexualised search suggestions in the “you may like” section, leading to videos of women flashing underwear, exposing breasts, and simulating masturbation. At the most extreme, explicit pornography appeared hidden inside seemingly harmless clips to bypass moderation.
Global Witness sounds alarm
Ava Lee from Global Witness described the findings as a “huge shock”. She said TikTok not only fails to protect children but actively recommends harmful content. Global Witness usually examines how technology affects human rights, democracy, and climate change. The group first discovered TikTok’s explicit material during unrelated research in April.
TikTok defends its safety features
Researchers reported their findings earlier this year. TikTok said it removed the flagged content and implemented fixes. But when Global Witness repeated the test in late July, sexual videos reappeared. TikTok says it offers over 50 safety tools for teenagers. It claims nine out of ten violating clips are removed before anyone views them. After the latest report, the company said it upgraded its search functions and removed additional harmful material.
New regulations heighten accountability
On 25 July, the Children’s Codes under the Online Safety Act came into force. Platforms must now enforce robust age verification and prevent children from accessing pornography. Algorithms must also block material linked to self-harm, suicide, or eating disorders. Global Witness carried out its second study after the rules took effect, and Ava Lee urged regulators to act, saying it is now time for children’s online safety to be enforced.
Users question sudden sexual content
During the investigation, researchers observed user reactions. Some expressed confusion at the appearance of sexualised search suggestions. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”
