Twitter is running ads on accounts promoting child sexual abuse


Twitter reportedly has a child pornography problem that goes beyond its ability to remove accounts that advertise child sexual abuse content.
photo: Justin Sullivan (Getty Images)

Twitter has struggled to remove accounts that mass-peddle child sexual abuse material on its platform, a report shows, and now advertisers have declared they do not want their ads associated with a platform that cannot police non-consensual material.

Reuters first reported Wednesday that major brands had found their ads appearing next to tweets soliciting child abuse material. The outlet said ads from some 30 advertisers, including The Walt Disney Company, Coca-Cola, Dyson, NBCUniversal, and others, appeared on the profile pages of Twitter accounts that actively sold and solicited child sexual abuse material.

Reuters reports that the names of about 30 brands emerged in a study of online child sexual abuse by cybersecurity firm Ghost Data. Some of these advertisers, such as chemical company Ecolab, home tech maker Dyson, and automaker Mazda, have since pulled their ads from the platform, the report said.

In a separate report, Business Insider said Twitter began notifying advertisers on Wednesday that their ads had run on these types of profiles, apparently after Reuters shared its findings with the company. Emails seen by Insider reportedly said the company had banned the accounts for violating Twitter's rules and was investigating how many people the ads may have reached.

"We are working closely with our customers and partners to investigate the situation and take appropriate steps to prevent this from happening in the future," a Twitter spokesperson said in an emailed statement to Gizmodo, apparently referring to updates to the company's detection and account suspension methods. The spokesperson added that Twitter had suspended all the profiles it found peddling child sexual abuse content, though they did not disclose how many accounts were involved.

The company does publish a twice-yearly transparency report. In it, Twitter said it suspended nearly 600,000 accounts for child sexual exploitation and took moderation action against another 600,000 from July through December 2021.

But despite these numbers, the question remains: how much child sexual abuse material is on Twitter? After all, Twitter is not like Pornhub, a site that came under fire for actively profiting from child abuse content. When that site struggled to make any concerted effort to purge child sexual abuse material, credit card companies dropped their support for it and executives departed.

Reuters quoted Ghost Data as saying that 70% of the 500 accounts it found trading in child sexual abuse material were not deleted within 20 days in September. At least one of these accounts requested pornography of someone "13+". These accounts advertise their content on Twitter, then move to apps like Telegram or Discord for the actual transactions, sharing content via Mega and Dropbox.

According to a recent report from The Verge, Twitter has known about such accounts for some time. Earlier this year, Twitter reportedly mulled its own version of OnlyFans, essentially giving creators the option to offer paid subscriptions for adult content. The initiative was shut down after a dedicated team reported that Twitter had consistently failed to police "massive child sexual exploitation and non-consensual nudity."

The report, based on internal documents and interviews with unnamed employees, noted that employees had been warning the company about child pornography for more than a year. Launching an OnlyFans-type operation could also have cost Twitter advertising revenue.

Other sites, such as Reddit, have also been cited in lawsuits for failing to police sexual content involving minors, but Twitter is far more dependent on ad revenue.

Considering that 92% of Twitter's revenue in 2021 came from advertising, according to industry data, Twitter may need stronger enforcement to keep advertisers happy.
