Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • faeranne@lemmy.blahaj.zone · 1 year ago

    Another way to look at it is: How would you solve this problem with email?
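    Email’s answer is instructive: SMTP is an open protocol, and moderation happens at each receiving server, which typically consults a shared reputation list (a DNSBL) but always makes its own accept/reject decision. A rough Python sketch of that pattern, assuming the third-party dnspython package and using Spamhaus’s public list as the example (the helper name here is mine, not any real library’s):

    ```python
    import dns.resolver

    def listed_on_dnsbl(ip: str, dnsbl: str = "zen.spamhaus.org") -> bool:
        """Check an IPv4 address against a DNS-based blocklist."""
        # Reverse the octets and query inside the blocklist zone,
        # e.g. 203.0.113.7 -> 7.113.0.203.zen.spamhaus.org
        query = ".".join(reversed(ip.split("."))) + "." + dnsbl
        try:
            dns.resolver.resolve(query, "A")
            return True   # any A record back means the IP is listed
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return False  # no record means the IP is not on the list

    # A mail server might call this while the SMTP connection is open and
    # reject listed peers, but nothing in the protocol forces it to.
    ```

    No central authority makes a server honor any list; each admin picks which lists, if any, to trust. The fediverse inherits exactly that trade-off.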

    The reality is, there is no way to solve the problem of moderation across disparate servers without some unified point of contact. With any form of federation, your options are:

    1. Close-source the protocol, API, and implementation, and make the creator the final arbiter, whether by proxy of code or by way of a backdoor.
    2. Have every instance agree to a single set of rules/admins.
    3. Don’t, and just let each instance decide where to draw the line (see the sketch after this list).
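    To make option 3 concrete, here’s a minimal sketch (hypothetical names, not Mastodon’s actual moderation code) of each instance applying its own locally maintained policy to incoming federated posts. The same post can be dropped by one instance and accepted by another, and no network-wide step exists to override either decision:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class InstancePolicy:
        # Rules maintained by this instance's admins alone; there is no
        # authoritative network-wide copy anywhere.
        blocked_domains: set[str] = field(default_factory=set)
        blocked_hashtags: set[str] = field(default_factory=set)

        def allows(self, origin_domain: str, hashtags: list[str]) -> bool:
            if origin_domain in self.blocked_domains:
                return False  # defederated: drop everything from that server
            return not any(t.lower() in self.blocked_hashtags for t in hashtags)

    # Two instances drawing the line in different places:
    strict = InstancePolicy(blocked_domains={"bad.example"})
    permissive = InstancePolicy()

    print(strict.allows("bad.example", ["memes"]))      # False: rejected here
    print(permissive.allows("bad.example", ["memes"]))  # True: same post, other policy
    ```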

    Ultimately, any federated system is going to have these issues, and as long as the protocol is open, anyone can build any instance they want on top of it. It would be wonderful to solve this “properly”, but it’s like dealing with encryption: you can’t force bad actors to play by the rules, and any attempt to do so breaks the fundamental purpose of these systems.