Instagram’s Reels video service is designed to show users streams of short videos on topics the system decides will interest them, such as sports, fashion or humor.
The Meta Platforms-owned social app does the same thing for users its algorithm decides might have a prurient interest in children, testing by The Wall Street Journal showed.
The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.
Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands.
[I’ve only recently become active on Instagram and don’t see a lot of weird content in the Reels, but for some reason I do find myself popular with accounts of barely dressed women. I think it’s my scalp that turns them on. More seriously, it’s pretty clear that while Meta bars porn on its platforms, porn producers and honeypot scammers are very active on Instagram. This doesn’t surprise me, but it’s not clear what Meta will be able to do to tame it. — Ed]