cross-posted from: https://beehaw.org/post/6795142

Mastodon, an alternative social network to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, researchers found over 100 instances of known CSAM across over 325,000 posts on Mastodon. The researchers found hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and grooming of minors. One Mastodon server was even taken down for a period of time due to CSAM being posted. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • poVoq@slrpnk.net

    Very sensationalist headline.

    If you read the paper, it is mostly that one well-known Japanese instance, where the content is mostly legal under Japanese law.

  • density@kbin.social

    In just two days, researchers found 112 instances of known CSAM across 325,000 posts

    “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,”

    In the whole history of this group they have found fewer than 112 pieces of CSAM? It’s Stanford University. Why not drop in on a few of Jeffrey Epstein’s friends and fans? They can tell you where to look.

    • emeralddawn45@discuss.tchncs.de

      Yeah, literally. What a propaganda piece. Now do Twitter, or Facebook, or Instagram. Except due to the walled-garden effect of those platforms, the dangerous material probably isn’t viewable by just anyone. That doesn’t mean it’s not there, though.

      • Or Reddit. You know, the website where a community dedicated to sharing CSAM was one of the biggest on the site and its lead moderator was a sitewide celebrity (oh, and Reddit’s current top admin was also a moderator on that community).

      • Quik@infosec.pub

        I don’t think it’s a propaganda piece, as it even brings up ideas on how to do moderation better in the Fediverse; it seems to me a bit too constructive to just call it propaganda and move on.

  • 0x1C3B00DA@kbin.social

    I have argued for a while that the Fediverse is way behind in this area; part of this is lack of tooling and reliance on user reports, but part is architectural. CSAM-scanning systems work one of two ways: hosted, like PhotoDNA, or privately distributed hash databases. The former is a problem because all servers hitting PhotoDNA at once for the same images doesn’t scale. The latter is a problem because widely distributed hash databases allow for crafting evasions or collisions.

    -- https://hachyderm.io/@det/110769474386499134

    This is from the study’s author (here’s the full thread). It shows how pernicious centralization in technology is. The author claims the fediverse is “behind”, when really it is the tools that are behind in supporting decentralized services: they were developed with only centralized Silicon Valley silos in mind, they can’t keep up with a decentralized infrastructure, and the author’s solution is for decentralized services to centralize around those tools.
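
    To make the trade-off concrete, here is a minimal sketch of the second model (a locally held, privately distributed hash database), using the open-source imagehash library as a stand-in; the hash set and matching threshold are hypothetical, and real systems like PhotoDNA use proprietary robust hashes rather than pHash:

    ```python
    # Sketch of the "privately distributed hash database" model: match
    # incoming media against a locally held set of known perceptual hashes.
    from PIL import Image
    import imagehash

    # Hypothetical: in practice this would be populated from a vetted,
    # privately distributed database of known hashes.
    KNOWN_HASHES: set[imagehash.ImageHash] = set()

    def is_match(image_path: str, max_distance: int = 5) -> bool:
        """True if the image is within Hamming distance of any known hash."""
        h = imagehash.phash(Image.open(image_path))
        # Perceptual hashes tolerate re-encoding/resizing, so nearby
        # hashes (small Hamming distance) still count as matches.
        return any(h - known <= max_distance for known in KNOWN_HASHES)
    ```

    The evasion/collision risk the author mentions follows directly: anyone who holds the hash set can search for inputs that land just outside (or deliberately inside) the matching threshold.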

  • macniel@feddit.de

    Going by the blurb posted, not the link: how are they demanding more robust moderation and reporting tools when, evidently, reporting something even took down the instance in question?

    Who was the sponsor of this research, Zuck and Musk?

  • drdiddlybadger@pawb.social

    Isn’t this bound to happen without built-in automated tools for flagging and moderation? I’m not quite sure how the federation handles this sort of thing besides community modding and “if you see something, say something”.

    • debounced@kbin.run

      Yep, I use Cloudflare’s CSAM tool to help with this… it scans all object storage and cached items against known CSAM hashes. I don’t think most people hosting instances realize what a massive liability this is if it’s open to the web for all to see… the feds (only talking about the USA here) will shut you down, or worse, threaten charges if nothing is done about it.
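
      For admins rolling their own version of this, a rough sketch of what such a sweep might look like, assuming an S3-compatible media bucket; the bucket name and hash set are hypothetical placeholders, reusing the perceptual-hash check from the sketch above (Cloudflare’s actual tool runs on its own infrastructure, not your code):

      ```python
      # Rough sketch: sweep an S3-compatible media bucket and flag objects
      # whose perceptual hash matches a known database. Bucket name and
      # KNOWN_HASHES are hypothetical placeholders.
      import io

      import boto3
      import imagehash
      from PIL import Image

      KNOWN_HASHES: set[imagehash.ImageHash] = set()  # vetted hash set (hypothetical)
      s3 = boto3.client("s3")

      def scan_bucket(bucket: str = "instance-media") -> list[str]:
          flagged = []
          for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket):
              for obj in page.get("Contents", []):
                  body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
                  try:
                      h = imagehash.phash(Image.open(io.BytesIO(body)))
                  except OSError:
                      continue  # not a decodable image
                  if any(h - known <= 5 for known in KNOWN_HASHES):
                      flagged.append(obj["Key"])  # report/escalate, don't just delete
          return flagged
      ```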

    • Efwis@lemmy.zip

      The problem is that most of the people who see this crap want to see it. They take perversion to a whole new level, sick bastards 🤮🤮🤮

  • sciawp

    This is something I have worried about for a while. The core design of the fediverse makes stuff like this really easy to do, and there’s not really a solution. I guess government agencies just need to be on the lookout for it?

    • 👁️👄👁️

      There isn’t, except going after server owners or infiltrating groups like they already do. The same can be said about encryption, or even the TCP protocol, being used to distribute CP. You simply can’t add CSAM protection to everything, nor should you; that leads to a surveillance state.

      It’s an unfortunate byproduct of privacy that it will be used for evil, but that doesn’t mean we get rid of security. That’s how fascism begins.

      • sciawp

        I was literally just thinking about this, and I think you’re right.

        • 👁️👄👁️

          Yeah, and this is one of the strong arguments governments use to ban encryption as well, or to try to implement backdoors, which are fundamentally broken as a concept and just straight-up break encryption. Obviously we all know that’s bad. We also have to recognize that the privacy and security measures we advocate for do in fact enable criminals. Still, that’s okay, as they can be handled by other methods, and the right to privacy should be more important.

          You could make the same argument about guns and cars: they can be extremely powerful weapons for bad actors. That does happen, too; we have mass shootings and all sorts of crime involving cars. But despite that, we still allow everyone to have these things. Well, with guns that’s more debatable, but you get what I mean lol.

      • sciawp

        Of course! I’m more referring to instances specifically made for this kind of stuff. It’s so easy for anyone to boot up their own server, but I guess places like Discord and Telegram have similar issues.

  • candle_lighter@lemmy.ml

    Fortunately it’s all on Japanese instances, which many instances like Mastodon.social defederate from.

  • vintprox@geddit.social

    Mastodon, an alternative social network to Twitter

    Not reading what’s next. Probably some bull.