from Hacker News

nsfw.rest – Keep your platform safe from NSFW content, all for free

by lngzl on 8/15/21, 8:56 PM with 16 comments

  • by RicoElectrico on 8/15/21, 9:19 PM

    Can you consider giving a score instead of a binary decision? Or differentiate between, say, beach pics and true 18+ content.

    Edit: The accuracy leaves a lot to be desired - you can paste images from clothing stores of models in dresses or tank tops and it will flag them as NSFW.

    The "AI" seems to replicate an Islamic fundamentalist - a woman in a burqa did pass as SFW ;)

    Google offers a hosted SafeSearch version [1] which has a lot more nuance: https://cloud.google.com/vision/docs/reference/rpc/google.cl...
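    The score-vs-binary suggestion above could be sketched like this: SafeSearch-style APIs return a categorical likelihood per image rather than a yes/no flag, and the caller picks the cutoff. The five likelihood names below mirror the levels Google's Vision SafeSearch API returns; the numeric 0–1 mapping and the flag() helper are illustrative assumptions, not part of any real API.

    ```python
    # Sketch: converting a categorical likelihood verdict into a numeric
    # score with a tunable threshold, instead of a hard-coded binary call.
    # The score values and threshold here are arbitrary assumptions.

    LIKELIHOOD_SCORE = {
        "VERY_UNLIKELY": 0.05,
        "UNLIKELY": 0.25,
        "POSSIBLE": 0.50,
        "LIKELY": 0.75,
        "VERY_LIKELY": 0.95,
    }

    def flag(likelihood: str, threshold: float = 0.7) -> bool:
        """Treat the image as NSFW only above the caller's threshold."""
        return LIKELIHOOD_SCORE.get(likelihood, 0.0) >= threshold

    # With a 0.7 threshold, a "POSSIBLE" beach pic passes as SFW,
    # while a "VERY_LIKELY" hit is flagged.
    print(flag("POSSIBLE"))     # False
    print(flag("VERY_LIKELY"))  # True
    ```

    Exposing the raw score (or the likelihood level) would let each platform choose its own strictness, which is exactly the nuance the comment asks for.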

  • by tentacleuno on 8/16/21, 1:03 AM

    It's a good idea in theory, but what if it shuts down? And what about the privacy implications of sending all images to some server across the world?

  • by peanut_worm on 8/16/21, 1:48 AM

    It marked 3/4 SFW images as NSFW and then marked the only NSFW image I uploaded as being OK.

    I am not sure this thing is very accurate at all.

  • by nextaccountic on 8/16/21, 1:43 AM

    What are the API limits? That is, how many images scanned is too many?

  • by cucumb3rrelish on 8/16/21, 2:24 AM

    Amazing accuracy, I tried to trick it but it got everything right