Meta Platforms Inc., the parent company of Facebook and Instagram, is facing criticism for reportedly allowing ads that promote illegal drug sales on its platforms.
Despite policies prohibiting such content, a recent report by The Wall Street Journal revealed that Meta has been profiting from ads that openly advertise illicit substances, including cocaine and prescription opioids.
These ads, often displaying images of prescription drug bottles, piles of pills, or even bricks of cocaine, have been appearing on Facebook and Instagram.
One particularly alarming ad from July featured a photo of a razor blade and yellow powder arranged to spell out “DMT,” a psychedelic drug, and encouraged users to “Place your orders.”
Meta’s artificial intelligence tools, designed to moderate content, have struggled to filter out these drug-related ads. The ads relied on images rather than text to evade automated detection, and directed users to private group chats on Telegram, a messaging app not owned by Meta, to complete transactions.
Once the ads were identified, Meta removed many of them within 48 hours and banned the accounts responsible for creating them. Even so, the incident raises serious questions about the effectiveness of Meta’s content moderation systems and its responsibility for preventing illegal activity on its platforms.
Lawmakers continue to debate how accountable technology companies should be for third-party content, particularly in light of Section 230 of the Communications Decency Act, which generally shields online platforms from liability for what users post.
This incident is not the first time Meta has faced scrutiny over the sale of illegal drugs on its platforms; similar issues were flagged earlier this year, prompting an investigation by U.S. prosecutors.
As this story unfolds, it highlights ongoing concerns about the regulation of online content and the responsibility of tech giants like Meta to ensure their platforms are not used for illegal activities.