A judge has dismissed a complaint from a parent and guardian of a girl, now 15, who was sexually assaulted when she was 12 years old after Snapchat recommended that she connect with convicted sex offenders.

According to the court filing, the abuse that the girl, C.O., experienced on Snapchat happened soon after she signed up for the app in 2019. Through its “Quick Add” feature, Snapchat “directed her” to connect with “a registered sex offender using the profile name JASONMORGAN5660.” After a little more than a week on the app, C.O. was bombarded with inappropriate images and subjected to sextortion and threats before the adult user pressured her to meet up, then raped her. Cops arrested the adult user the next day, resulting in his incarceration, but his Snapchat account remained active for three years despite reports of harassment, the complaint alleged.

  • nyan@lemmy.cafe
    9 months ago

    Bunch of things going on here.

    On the one hand, Snapchat shouldn’t be liable for users’ actions.

    On the other hand, Snapchat absolutely should be liable for its recommendation algorithms’ actions.

    On the third hand, the kid presumably lied to Snapchat in order to get an account in the first place.

    On the fourth hand, the kid’s parents fail at basic parenting in ways that have nothing to do with Snapchat: “If you get messages on-line that make you uncomfortable or are obviously wrong, show them to a trusted adult—it doesn’t have to be us.” “If you must meet someone you know on-line in person, do it in the most public place you can think of—mall food courts during lunch hour are good. You want to make sure that if you scream, lots of people will hear it.” “Don’t ever get into a car alone with someone you don’t know very well.”

    Solution: make suggestion algorithms opt-in only (if they’re useful, people will opt in). Don’t allow known underage individuals to opt in—restrict them to a human-curated “general feed” that’s the same for everyone not opted in if you feel the need to fill in the space in the interface. Get C.O. better parents.
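    To make the proposal concrete, here's a hypothetical sketch of the gating logic (the names, the under-18 cutoff, and the feed contents are all my assumptions, not anything Snapchat actually does; a real system would also need a reliable age signal, which is its own hard problem):

```python
# Hypothetical sketch: algorithmic suggestions are strictly opt-in,
# known-underage users can never opt in, and everyone who hasn't
# opted in sees a single human-curated feed shared by all.
CURATED_GENERAL_FEED = ["staff_pick_a", "staff_pick_b"]  # reviewed by humans

def suggestions_for(age: int, opted_in: bool, algorithmic: list[str]) -> list[str]:
    # Minors can't opt in at all; adults get the algorithm only by choice.
    if age < 18 or not opted_in:
        return CURATED_GENERAL_FEED
    return algorithmic

print(suggestions_for(12, True, ["algo_pick"]))   # minor: curated feed regardless
print(suggestions_for(30, False, ["algo_pick"]))  # adult, not opted in: curated feed
print(suggestions_for(30, True, ["algo_pick"]))   # adult, opted in: algorithmic
```

    The point of routing non-opted-in users to one shared feed is that a single feed only has to be human-reviewed once, which keeps the moderation cost fixed instead of scaling with the user base.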

    None of that will happen, of course.

    • makeasnek@lemmy.ml
      9 months ago

      On the other hand, Snapchat absolutely should be liable for its recommendation algorithms’ actions.

      Should they, though? The algorithm can be as simple as “show me the users with the most liked posts”. Even the best-designed algorithm is going to suggest that users connect with sex offenders, because the algorithm has no idea who is a sex offender. Unless Snapchat has received an abuse report of some kind or actively monitors all accounts all the time, it has no way to know a user is dangerous. Even if it did monitor the accounts, it won’t know a user is dangerous until they do something dangerous, and even then it may not be obvious from their messages and photos. An online predator asking a 12-year-old to meet them somewhere looks an awful lot like a family member asking the same thing, assuming there’s nothing sexually suggestive in the message.

      And requiring that level of monitoring is extremely expensive and invasive. It means only big companies with teams of lawyers can run online social media services. You can say goodbye to the fediverse in that case, along with any expectation of privacy you or anybody else can have online. And then, well, hello turnkey fascism for the next politician who gets into power and wants to stifle dissent.
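      A toy version of that “simplest possible algorithm” makes the point. This is an illustrative sketch, not Snapchat’s actual Quick Add implementation; note that nothing in the data being ranked says anything about who is dangerous:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    total_likes: int  # the only signal the ranker ever sees

def quick_add(candidates: list[User], limit: int = 3) -> list[User]:
    # "Show me the users with the most liked posts": a pure popularity
    # sort. User carries no criminal-history field, so the algorithm
    # cannot distinguish a predator from anyone else.
    return sorted(candidates, key=lambda u: u.total_likes, reverse=True)[:limit]

pool = [User("alice", 120), User("JASONMORGAN5660", 95), User("bob", 10)]
print([u.name for u in quick_add(pool, limit=2)])  # ['alice', 'JASONMORGAN5660']
```

      The offender’s account surfaces purely because it is popular; filtering it out would require a data source (sex-offender registries, abuse reports, message scanning) that lives entirely outside the ranking function.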

      Kids being hurt is bad. We should work to build a society where it happens less often. We shouldn’t sacrifice free, private speech in exchange or relegate speech only to the biggest, most corporate, most surveilled platforms. Because kids will still get hurt, and we’ll just be here with that many fewer liberties. Let’s not forget that the US federal government has a list of known child sex offenders in the form of Epstein’s client list and yet none of them are in prison. I don’t believe that giving the government more control and surveillance over online speech is going to somehow solve this problem. In fact, it will make it harder to hold those rich, well-connected, child rapist fucks accountable because it will make dissent more dangerous to engage in.

      • nyan@lemmy.cafe
        9 months ago

        Yes, they should. They chose to deploy the algorithm rather than using a different algorithm, or a human-curated suggestion set, or nothing at all. It’s like a store offering one-per-purchase free bonus items while knowing a few of them are soaked in a contact poison that will make anyone who touches them sick. If your business uses a black box to serve clients, you are liable for the output of that black box, and if you can’t find a black box that doesn’t produce noxious output, then either don’t use one or put a human in the loop. Yes, that human will cost you money. That’s why my suggestion at the end was to use a single common feed, to reduce the labour. If they can’t get enough engagement from a single common feed to support the business, maybe the business should be allowed to die.

        The only leg Snapchat has to stand on here is the fact that “C.O.” was violating their TOS by opening an account when she was under the age of 13, and may well have claimed she was over 18 when she was setting up the account.

    • Pika@sh.itjust.works
      9 months ago

      I’m failing to see how it’s Snapchat’s problem: it couldn’t know that the person was nefarious, and it’s not reasonable to expect that it should have been able to. This is like saying Disney should be held responsible because someone decided to go on a killing spree while wearing the recommended costume of the week. It’s two isolated events that happen to coincide with each other.

      This is a failure on the parents’ side all the way down, from the lack of supervision to allowing a social media account below the legal age to make one.

      • nyan@lemmy.cafe
        9 months ago

        Snapchat is not the only problem here, but it is a problem.

        If they can’t guarantee their recommendations are clean, they shouldn’t be offering recommendations. Even to adults. Let people find other accounts to connect to for themselves, or by consulting some third party’s curated list.

        If not offering recommendations destroys Snapchat’s business model, so be it. The world will continue on without them.

        It really is that simple.

        Using buggy code (because all nontrivial code is buggy) to offer recommendations only happens because these companies are cheap and lazy. They need to be forced to take responsibility where it’s appropriate. This does not mean that they should be liable for the identity of posters on their network or the content of individual posts—I agree that expecting them to control that is unrealistic—but all curation algorithms are created by them and are completely under their control. They can provide simple sorts based on data visible to all users, or leave things to spread externally by word of mouth. Anything beyond that should require human verification, because black box algorithms demonstrably do not make good choices.

        It’s the same thing as the recent Air Canada chatbot case: the company is responsible for errors made by its software, to about the same extent as it is responsible for errors made by its employees. If a human working for Snapchat had directed “C.O.” to the paedophile’s account, would you consider Snapchat to be liable (for hiring the kind of person who would do that, if nothing else)?

        • Pika@sh.itjust.works
          9 months ago

          No, I would not, unless it was proven that said employee knew the person was an S.O. and knew that the account belonged to a minor (but at that point the employee should have disabled the account per Snapchat’s policy regardless). If that data was not available to them, then they wouldn’t have had the capability to know, so I would consider them not at fault.

          • nyan@lemmy.cafe
            9 months ago

            Then, in my opinion, you would have failed to perform due diligence. Even if you’d thought C.O. was an adult, suggesting a woman strike up a private conversation with a man neither of you know is always something that deserves a second look (dating sites excepted), because the potential for harm is regrettably high.

      • Chaos@lemmy.world
        9 months ago

        It isn’t, and the courts agreed with that. Seems like an issue with legislative law. As far as I was aware, sex offenders were supposed to have Internet restrictions…

        Could there be a good discussion about trying to prevent further harm to children? Well, yeah. Some parents just suck, and it’s the kid that gets hurt.

        As long as it doesn’t involve stuff like KOSA, which puts more people in harm’s way.