A leading psychologist who advises Meta on suicide prevention and self-harm has quit her role, accusing the tech giant of “turning a blind eye” to harmful content on Instagram, repeatedly ignoring expert advice and prioritising profit over lives.

  • Viking_Hippie@lemmy.world · 8 months ago

    This has been a general trend at Facebook/Meta ever since employees first became aware of the problems:

    1: People are harmed, or harm others, through and because of the company.

    2: One or more employees make the higher-ups aware of the problem, often suggesting specific solutions that would dramatically reduce its frequency and severity.

    3: The media find out.

    4: Zuckerberg himself or a spokesperson makes a statement to the press about how “Meta takes these things very seriously and is working on solutions”.

    5: Zuckerberg vetoes every solution that would reduce engagement and therefore ad revenue.

    6: The employee(s) get replaced or resign.

    7: Back to 1 and repeat forever.

    • We need to put the focus on this. Advertisers decided to pull out of Twitter because they didn’t want their products associated with nazi conspiracies like “the Jews did 9/11”. Why would they want their products associated with “Debby, 15 years old, killed herself because of self-harm content on Instagram”?