from Hacker News

The problem with Facebook

by loisaidasam on 3/22/18, 3:58 PM with 58 comments

  • by astalwick on 3/22/18, 4:10 PM

    https://twitter.com/fchollet/status/976783608219279360

    He goes on to defend his work at Google, arguing that they're similar on the surface, but Facebook is truly dangerous where Google is not.

    I'm not sure I agree with that. Google is quite a bit more distributed across products and platforms, whereas Facebook has a simpler loop centered on the newsfeed. That said, Google can track a user's behaviour across nearly every website on the internet.

    Facebook can run these "reinforcement learning on a global scale" experiments through its newsfeed. Google, it seems to me, can run them across the web as a whole.

  • by m_ke on 3/22/18, 4:17 PM

    He's ignoring the fact that YouTube is as bad if not worse than Facebook.

    They might demonetize some radical channels but they're still making money on users who get to YouTube through those channels.

    https://mobile.nytimes.com/2018/03/10/opinion/sunday/youtube...

    https://mobile.nytimes.com/2017/11/04/business/media/youtube...

    https://news.vice.com/en_us/article/d3w9ja/how-youtubes-algo...

  • by fortythirteen on 3/22/18, 4:28 PM

    The work Chollet is doing at Google is reaching an equally nefarious end. There are studies showing how manipulated search results have the same effects on perception, and YouTube is manipulating their feed in the same ways as Facebook.

    His analysis is correct, but this:

    > If you work in AI, please don't help them. Don't play their game. Don't participate in their research ecosystem. Please show some conscience

    is a clear cut case of the pot calling the kettle black.

  • by t3chn0SchO0lbus on 3/22/18, 4:15 PM

    The Twitter essay is my least favorite thing about the future we live in.

  • by minikites on 3/22/18, 4:23 PM

    https://twitter.com/fchollet/status/976784465245515776

    >Essentially nothing about the threat described applies to Google. Nor Amazon. Nor Apple.

    >It could apply to Twitter, in principle, but in practice it almost entirely doesn't.

    I don't believe this for one second. Google does the exact same "algorithmic curation" with its search results. Different people get different results based on internal profiles that Google has built: https://en.wikipedia.org/wiki/Filter_bubble. Over time that shift in search result content acts upon people in exactly the same way as the Facebook example he describes.

  • by throwaway84742 on 3/22/18, 4:38 PM

    A Google employee points out a problem with FB that’s also very much a problem with Google. That’s rich. Someone is about to receive a STFU email from HR.

  • by spdy on 3/22/18, 4:32 PM

    For this discussion we have to set aside the fact that he works for Google.

    But he is right: we are at a crossroads, and the path that will be taken seems clear to me. Manipulating/controlling populations is where the money will go, and people will build those tools because those jobs pay well. It's that simple.

    In the next elections we will see deepfake videos of candidates instantly responding to events, or defamatory videos put out where you can't judge on the spot whether they're real or not. The trend of echo chambers will continue as we see it right now.

    The only remedy I can see is education. Look at recent history: we went from only a certain share of people being able to read and write, or having access to books, to everyone having to learn and having access to libraries. This is the next level.

    And on the other side we have to fight for our right to privacy and kill some business models along the way. Right now, to me, this is on the same scale as atomic/chemical weapons.

  • by jphalimi on 3/22/18, 4:55 PM

    It saddens me that someone who so brilliantly summarizes the problems of extensive AI-driven content curation at tech companies does not seem to understand that the company he works for suffers from the exact same problems.

    The real question I have reading this thread is: is this guy being very naive, or just dishonest?

  • by mkrum on 3/22/18, 4:18 PM

    As someone mentioned in a comment section elsewhere, "It is very easy to sacrifice another person's job."

  • by dblotsky on 3/22/18, 4:46 PM

    I think the nefariousness is overblown. I present to you a contender: an elementary school curriculum. No AI, and way more influence over basically everything you will hold as truth for decades.

  • by TACIXAT on 3/22/18, 5:18 PM

    The problem with social media is that it isn't social. Telling me which article to read isn't social. Showing me someone's status isn't social. Posting a tweet and getting 2 likes is not social. Commenting on HN is probably the closest thing to social because someone might actually interact with me.

    If you have all this data, make my life more fulfilling. I get more community out of IRC than I do on any of the major social media sites. They are really just media sites.

  • by antisocial on 3/22/18, 4:16 PM

    I agree with everything. Going by that logic, I think we should all celebrate that Google Plus is not as successful as Facebook. But for all your concerns, Mr. Chollet, what assurances can you give us that Google is not involved in something similar?

  • by panarky on 3/22/18, 5:13 PM

    > We’re looking at a powerful entity that builds fine-grained psychological profiles of over two billion humans, that runs large-scale behavior manipulation experiments, and that aims at developing the best AI technology the world has ever seen. Personally, it really scares me

    > If you work in AI, please don't help them. Don't play their game. Don't participate in their research ecosystem. Please show some conscience

    When Facebook does something awful, their defenders rush to say "what about Google, they're even worse!"

    There's a lot of false equivalence in HN discussions, but these two are not in the same galaxy when it comes to abusing the privacy of their users.

  • by kough on 3/22/18, 4:20 PM

    I can't take this genre of tech/opticon commentary seriously when they remove all human agency. Reading this argument, there's an implicit judgement that (1) humans have no choice but to be influenced by Facebook, and (2) other methods of information retrieval are somehow neutral. Sure, I agree that understanding power structures is important – what a novel and interesting point /s.

  • by artursapek on 3/22/18, 4:08 PM

    I can never take seriously someone explaining something in detail over a long series of tweets. What ever happened to personal websites for god's sake?

  • by 0majors on 3/22/18, 4:30 PM

    And Facebook is selling these capabilities to the highest bidder, regardless of their moral or ethical standing.

  • by evc123 on 3/22/18, 4:26 PM

    Nah, fchollet just doesn't want pytorch to minimize keras:

    https://twitter.com/jekbradbury/status/976612114260357120