The NYT and Canadian experts say Twitter is not doing enough to curb child exploitation


According to a report from The New York Times, child sexual abuse material (CSAM) still persists on Twitter, despite Elon Musk stating that cracking down on child exploitation content is “priority #1” for the company.

Working with the Canadian Centre for Child Protection, which helped match abusive images against its CSAM database, the Times says it uncovered content on Twitter that had previously been flagged as exploitative, as well as accounts offering to sell more.

During its search, the Times says it found images of 10 child abuse victims appearing in 150 instances “across multiple accounts” on Twitter. The Canadian Centre for Child Protection reported similarly disturbing results, finding 260 of the “most explicit videos” in its database on Twitter, which together garnered over 174,000 likes and 63,000 retweets.

Twitter reportedly promotes CSAM through its recommendation algorithm

According to the Times, Twitter even promotes some of these images through the recommendation algorithm that surfaces suggested content for users. The platform reportedly took down only some of the content after the Canadian center notified the company.

Earlier this month, Twitter said it’s “proactively and severely limiting the reach” of CSAM and that the platform will work to “remove the content and suspend the bad actor(s) involved.” The company claims it suspended around 404,000 accounts that “created, distributed, or engaged with this content,” a 112 percent increase since November.

“The volume [of CSAM] we’re able to find with a minimal amount of effort is quite significant,” Lloyd Richardson, the Canadian center’s technology director, tells the Times. “It shouldn’t be the job of external people to find this sort of content sitting on their system.”


