by xufi on 10/21/16, 4:06 PM with 92 comments
by jacquesm on 10/21/16, 4:34 PM
Facebook needs to grow up. Removing porn is one thing, but clinical images, mothers feeding their children, and the like should not even be up for discussion.
It's a fine line between protecting your users from offensive content and outright censorship. It's good to see them doing the right thing in this case; pity it's still handled on a case-by-case basis instead of through a proper review of their policies.
The main criterion seems to be 'is the internet raising a large enough stink?' If yes, restore the image.
by alex- on 10/21/16, 5:10 PM
It appears that Facebook never argued that the image was in breach of its policies, just that some software it runs had a bug that misclassified this image.
Then, when challenged, they apologized and approved the ad.
So to me the summary appears to be "Software company has a bug that affected one customer, apologizes and fixes the issue," which must happen every hour of every day...
Am I missing something?....
by h4nkoslo on 10/21/16, 6:53 PM
I'm really tired of people engaging in pointless signaling campaigns and expecting to get points for being So Brave in the face of near-universal consensus that they are correct, or taking minor bureaucratic snafus like this as evidence that they are somehow not in a position of complete victory.
by pyrophane on 10/21/16, 5:16 PM
by fnbr on 10/21/16, 5:27 PM
Facebook is trying to automate the detection of illegal/unwanted images, and it seems extremely difficult to detect the context of an image to the extent that you can differentiate between acceptable images of human bodies and unacceptable ones (which would be, I assume, the vast, vast majority of such images posted).
I wonder how they could proceed with this: maybe with some sort of anomaly detection, where you do a first pass to detect all images containing the unwanted features (e.g. naked bodies), and then a second pass to try to detect the activity that's going on, or to detect whether the image is famous (e.g. a picture of David, the famous Italian statue, would be acceptable, while a photo of a naked man in the same pose would presumably not be).
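A minimal sketch of that two-pass idea in Python (contains_nudity, classify_context, the 0.5 threshold, and the allowed-context list are all made-up placeholders for illustration, not anything Facebook is known to run):

    # Pass 1: a cheap detector flags images that might contain nudity at all.
    # Pass 2: only that flagged minority goes through a more expensive context
    # model; anything still ambiguous goes to human review instead of being
    # auto-removed.

    def contains_nudity(image_bytes: bytes) -> float:
        """Hypothetical first-pass model: probability the image shows nudity."""
        return 0.0  # placeholder score

    def classify_context(image_bytes: bytes) -> str:
        """Hypothetical second-pass model: 'medical', 'artwork', 'news', ..."""
        return "unknown"  # placeholder label

    ALLOWED_CONTEXTS = {"medical", "artwork", "breastfeeding", "news"}

    def moderate(image_bytes: bytes) -> str:
        if contains_nudity(image_bytes) < 0.5:
            return "allow"                 # most images never reach pass 2
        if classify_context(image_bytes) in ALLOWED_CONTEXTS:
            return "allow"                 # e.g. a clinical photo, or the statue of David
        return "send_to_human_review"      # don't auto-remove borderline cases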
by geff82 on 10/21/16, 4:55 PM
by striking on 10/21/16, 4:31 PM
But it bothers me that we leave so much of our discourse to such imperfect systems.
by tomcam on 10/21/16, 5:55 PM
by turblety on 10/21/16, 7:53 PM
by tn13 on 10/21/16, 5:12 PM
Have one single clear principle and apply it consistently. If needed, change the principle; don't make exceptions. "Educational videos won't be removed" could have been a good policy, like the one Google has had for YouTube.
Or even "no breasts" could be a good policy too. If you want to show breast cancer videos, do it on YouTube, shoot it with a prop, or link to another page. I don't see why that wouldn't work.