by chwolfe on 6/7/19, 12:13 PM with 47 comments
by 0X3jdRielGJRHSU on 6/7/19, 11:21 PM
Meanwhile this[0] was posted on some of the entrances to the common buildings on Microsoft's campus for some time. They seemed to be training an algorithm for "fairness" by taking video of employees that were entering via certain doors. You could "opt out" by choosing another entrance though.
by klodolph on 6/7/19, 5:07 PM
by la_barba on 6/7/19, 10:54 PM
A: "Because we don't have anyone to maintain it"
News Media: "Microsoft deletes database".
Does anyone else feel like today's news articles are written by gossip/tabloid writers?
by reitanqild on 6/7/19, 9:39 PM
> More recently, Microsoft rejected a request from police in California to use its face-spotting systems in body cameras and cars.
Sounds like they are actually starting to get some principles and are standing up for them.
by codemac on 6/7/19, 5:04 PM
Are there any citations or press releases of Microsoft saying it was deleted?
by cbhl on 6/7/19, 5:39 PM
If they only deleted the training data but not the ML models generated from it, then you get the worst of both worlds: people still use the models, and there is no way to validate or improve the fairness of said models by adding or removing labelled training data.
by deugtniet on 6/7/19, 5:53 PM
by sbr464 on 6/8/19, 2:37 AM
It's the Asian subset, 4 files, about 300 Gigs.
http://trillionpairs.deepglint.com/overview
Additional info:
by asymptotically2 on 6/9/19, 5:07 PM
Maybe some of the images in the database were scraped from LinkedIn or GitHub
by lohszvu on 6/7/19, 11:32 PM