Adobe recently updated its terms of use. Companies revise their terms all the time, but these particular changes have sparked significant discord and discussion among users.

The updated terms of use give Adobe access to any form of media uploaded to its Creative Cloud and Document Cloud services, a change which immediately sparked a privacy backlash and led to many users calling for a boycott. So annoyed were paying customers that Adobe was forced to issue a statement clarifying what the updated terms mean, and what they cover.

The changes Adobe made include switching the wording from “we will only access, view or listen to your content in limited ways” to “we may access, view or listen to your content”, and the addition of “through both automated and manual methods”. In the Content section, Adobe also revised its description of how it scans user data, adding an explicit mention of manual review.

In its explanation of the terms changes, Adobe said, “To be clear, Adobe requires a limited license to access content solely for the purpose of operating or improving the services and software and to enforce our terms and comply with law, such as to protect against abusive content.”

While the intentions behind these changes may be to enhance service quality and ensure compliance with legal standards, granting the company such broad access to personal and potentially sensitive content clearly feels intrusive to many users. The shift from an explicit limitation to a more open-ended permission for content access could be seen as a step backward for user control and data protection, and it raises concerns about privacy and user trust that Adobe’s statement doesn’t fully address.

  • just_another_person@lemmy.world · 5 months ago

    I’m positive they got notified they were hosting a massive amount of CSAM, or similarly awful AI-generated shit, since it’s the Wild West out there now. This was their only way out.

      • sabin@lemmy.world · 5 months ago

      Sounds like a smokescreen to me. All file-sharing services have this problem. The solution is to respond to subpoena requests and let the government do its job. They do not have to allow themselves to arbitrarily violate their users’ privacy in order to do that.

        • just_another_person@lemmy.world · 5 months ago

        No, they don’t. If you’re storing something that is found by a law enforcement agency, you are legally liable. That’s the difference.

        You can’t just say out loud “Hey users, please stop storing CSAM on our servers.” Not how that works.

          • sabin@lemmy.world · 5 months ago

          Adobe is not a video distribution platform. They do not have this level of culpability.

            • sabin@lemmy.world · 5 months ago

              That’s not the same as content distribution.

              Sharing content with clients cannot be done effectively through Creative Cloud.

              It does not make sense to try and stop the distribution at the level of video editing. Not only is the thought of child predators making regular use of professional editing software completely absurd, but even if you assume they do, why the fuck do you think they would use the inbuilt cloud sharing tools to do so?? They would just encrypt the contents and transmit it over any other file sharing service…

              It makes no sense to implement this measure: it does absolutely nothing to impede criminals, but it gives a company well known for egregious privacy violations unprecedented access to information that completely law-abiding clients have legitimate reasons to want to keep private.

              It is a farce. A smokescreen intended to encroach on customers’ precious data while doing nothing to assist law enforcement.

    • greenskye · 5 months ago

      I realize it’s gross and icky and morally problematic, but I really wonder if trying to have the government crack down on AI-generated CSAM is worth the massive risk to freedom of speech and privacy that it seems like it’s going to open us up to. It’s a massive blank check for everyone to become Big Brother.

      • just_another_person@lemmy.world · 5 months ago

        There are no laws about it anywhere right now, but I’m sure it’s about something more real. This has played out many times in the past (Amazon, Apple, Google, FB, etc.) across many different policing agencies: if they identify a HUGE host that is a problem, they notify them first and allow them to address the issue before making themselves known and cracking down on the offenders. This just feels like that same thing again.

        AI or not, if a court can prosecute a case, they’ll do so.