Meta has come under scrutiny yet again, and this time the reason is its short-form video platform, Reels. If you use Instagram Reels, you will know that the platform runs on an algorithm that studies the kind of Reels you like and interact with the most; soon enough, you start seeing more Reels suited to your past activity.
But now, it seems that users who follow teenage influencers are being served Reels containing sexualized depictions of children. This is deeply troubling: many users who follow such influencers are themselves young, and content like this is not only illegal and unethical but disturbing to viewers of any age group.
To make matters worse, these disturbing Reels are being paired with ads from reputable brands like Disney, Bumble, Pizza Hut, and even the Wall Street Journal. The Canadian Centre for Child Protection tested these claims and confirmed them, finding that it too was served such Reels.
In response to these reports, and after verifying them, Bumble, Match Group, Hims, and Disney have pulled their ads from the Meta platform. This, in turn, has pushed Meta to treat the matter as a priority. The company has acknowledged that the situation is unacceptable and says it is putting in the work to ensure it does not happen again.
What is even scarier is that the company apparently knew this could happen even before Reels launched, which means it has had all these years to prevent it. For some reason, nothing was done, as the assumption was that user activity would not trigger such recommendations. But it looks like it has, and Meta now says its best engineers are on the problem.