from Hacker News

Social-media platforms are destroying evidence of war crimes

by CPAhem on 9/26/20, 11:47 PM with 87 comments

  • by schoen on 9/27/20, 1:42 AM

    I was trying to reply to a post that got flagged, so I'll repost at the top level instead:

    I strongly disagree with [the other poster's lack of concern about war crimes], but I want to try to paraphrase the part of your intuition that makes sense to me.

    Facebook gets criticized by its users and governments both for taking down "too much" and "too little", sometimes with regard to the same post or subject area.

    Suppose user X posts something related to an armed conflict, violence, suffering, or death. This post might be part of a crime by X against Y, or evidence of a crime by Y against Z, or not really a crime at all but just really disturbing and upsetting. Also, in various circumstances people might want records about violent crimes against them to be destroyed, or publicized, or not destroyed but not publicized (only used by some judicial process, or truth-and-reconciliation process, or historians, or something). Also, user X might have an intention that's different from the primary value or valence of the content, like prurient enthusiasm for violence, or making one of the parties depicted look bad, or trying to intimidate one of the parties depicted.

    In order to figure out which category (or categories) a post falls into, Facebook has to (1) learn the language(s) involved in all posts, (2) learn the political context of violent conflicts, (3) perform some level of adjudication and fact-finding about political conflicts and disputes, and maybe even (4) try to understand the motives of the person who posted something on each particular occasion.

    Is it reasonable to expect Facebook to do all those things, compared to other choices that are consistent, neutral, and inevitably result in various type I and type II errors with respect to the nature and purposes of posts related to violence?

  • by hirundo on 9/27/20, 12:16 AM

    “Publicity is justly commended as a remedy for social and industrial diseases. Sunlight is said to be the best of disinfectants; electric light the most efficient policeman.” -- Louis Brandeis

    Count pictures on social media as a form of that light. Of course I don't want to be exposed to snuff films. But to the extent that blocking them sweeps evil under the rug, that's a selfish impulse. And it's not just the existence of such evidence that matters, as if blocking were fine so long as a copy is preserved for a court. It is our very disgust and outrage that drives change. To shield the public from it is to abet such crimes. It's a good thing to protect children from it and a bad thing to treat us all as children.

  • by skissane on 9/27/20, 12:30 AM

    Removed from public access != deleted

    For all we know, the content still exists on the social media platforms' servers, and access is just blocked to the public, but the platforms will happily provide it to the proper authorities in response to an appropriate request.

    (Of course, I can't rule out the possibility that they may have physically deleted some of it, or may do so eventually; I don't know what their data retention policies are. But I imagine they would be hesitant to permanently delete content that may plausibly be of future interest to regulators. This article contains zero information on whether they have actually physically deleted any of it or not.)

    Human Rights Watch is of course a purely private body, so unless they reach some special arrangement with the platforms (and platforms are sometimes willing to enter into those sorts of special arrangements), they are not going to have any more access to removed content than the general public has. But that doesn't mean that government authorities don't have more access than the general public has.

  • by anigbrowl on 9/27/20, 12:30 AM

    I hoped the article would go into more depth. For example, propagandists store caches of videos and imagery on distributed servers to keep circulating it, and manually or automatically run imagery through filters that distort the image enough to fool computers but are still easily recognizable to humans. In some cases this is done to such a degree that it becomes a kind of in-group aesthetic, a style distinctive enough to be its own signature and obviate the need for attribution.
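    The evasion described above can be illustrated with a toy example. This is a hypothetical sketch of a naive average hash, not how any platform's matcher actually works: a brightness tweak of a few levels, invisible to a human, flips bits in the hash that simple automated matching relies on.

    ```python
    # Toy demonstration of why slight image distortion can defeat naive
    # perceptual hashing (an average hash here), while a human would
    # still recognize the picture. Purely illustrative.

    def average_hash(pixels):
        """Hash an 8x8 grayscale image: one bit per pixel,
        set when the pixel is brighter than the image mean."""
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        return [1 if p > mean else 0 for p in flat]

    def hamming(a, b):
        """Count differing bits between two hashes."""
        return sum(x != y for x, y in zip(a, b))

    # A synthetic 8x8 image: left half dark, right half bright,
    # with a couple of pixels sitting close to the overall mean.
    original = [[40] * 4 + [200] * 4 for _ in range(8)]
    original[3][3] = 120  # near-threshold pixels
    original[4][3] = 125

    # A tiny perturbation: brighten two pixels by 10/256 levels --
    # visually negligible, but it crosses the hash threshold.
    distorted = [row[:] for row in original]
    distorted[3][3] += 10
    distorted[4][3] += 10

    h1, h2 = average_hash(original), average_hash(distorted)
    print(hamming(h1, h2))  # nonzero bit distance from a near-invisible edit
    ```

    Real matchers are more robust than this, which is presumably why the propagandists' distortions are heavier; the principle is the same.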
  • by neonate on 9/27/20, 12:45 AM

  • by malwarebytess on 9/27/20, 1:25 AM

    Censorship has innate negative nth-order consequences. To spare ourselves temporary discomfort, we destroy history.
  • by OmarShehata on 9/27/20, 12:07 AM

    This seems like a good thing. The article itself explains why you probably shouldn't use what you see on Facebook as evidence in court.
  • by known on 9/27/20, 6:12 AM

    "If you don't read the newspaper you are uninformed. If you do read the newspaper, you are misinformed" --Mark Twain (b. 1835) goo.gl/KXNf9
  • by mlb_hn on 9/27/20, 4:47 AM

    This has been a systemic issue reported on for years, e.g. by the Intercept in 2017 [1] and the Atlantic in 2019 [2]. The story doesn't really make that clear, considering the Economist headline is almost identical to the Atlantic one.

    [1] https://theintercept.com/2017/11/02/war-crimes-youtube-faceb...

    [2] https://www.theatlantic.com/ideas/archive/2019/05/facebook-a...

  • by aussieguy1234 on 9/27/20, 10:11 AM

    In a lot of countries, it's illegal to tamper with evidence of a crime. Would this destruction of evidence count? https://en.m.wikipedia.org/wiki/Tampering_with_evidence
  • by tomc1985 on 9/27/20, 12:09 AM

    Yet another one of the myriad negative externalities that have come from the rise of cloud everything and people forgetting how to download their media.

    Own your bits, people.

  • by fogetti on 9/27/20, 2:51 AM

    Good. I mean we need more such articles, so that just as the BBC shamed the Cameroonian government into action, we can shame tech companies into action too.
  • by erikerikson on 9/27/20, 1:34 AM

    It wouldn't be hard to feed these items to the appropriate governmental and non-governmental agencies, gated by some reasonable process.
  • by benatkin on 9/27/20, 3:42 AM

    Facebook has killed Rohingya. Mark Zuckerberg has blood on his hands.

    The U.S. has the ear of Facebook. Facebook listens to the E.U. but threatens to defy them. It largely ignores international human rights organizations, though.

    > In June, The Gambia filed an application in U.S. federal court seeking information from Facebook that would help it hold Myanmar accountable at the International Court of Justice (ICJ).

    > Earlier this month, the company filed its opposition to The Gambia’s application. Facebook said the request is “extraordinarily broad,” as well as “unduly intrusive or burdensome.”

    https://time.com/5880118/myanmar-rohingya-genocide-facebook-...

    Edit: "Facebook staffer sends 'blood on my hands' memo" https://www.bbc.com/news/technology-54161344

  • by renewiltord on 9/27/20, 1:16 AM

    Does anyone actually give a fuck about this? Everyone wanted Facebook to be where people go to see baby photos. Now it is that thing. It's fine. I see lots of baby photos and my life is good.

    So go upload to Liveleak if you want to post pictures of people shooting hooded people. Bro, I don't want to see that, and previously I was okay at avoiding it on Facebook. Then everyone decided that it was important that Facebook remove anything political that could be spun. Now they're doing that. Great. Now you want to complain about that?

    Listen, I'm going to be honest. No one gives a fuck about war crimes beyond an "Oh no, this is horrible" thing. If you asked people to contribute a dollar to fighting them you'd get zero dollars.

  • by ReptileMan on 9/27/20, 7:58 AM

    Ugh. Activist organizations trying to make social media do their work for them. While keeping the donations for themselves.

    Facebook just can't win. They should just start ignoring such demands.

    Want evidence of war crimes preserved? Then do it yourself.