by martingordon on 10/3/20, 2:41 AM with 121 comments
by adim86 on 10/3/20, 6:07 AM
This whole press release is about the "steps" they have taken, but very few of them are described in detail or in a way we can verify. There is no pledge to do better and no consequence for not meeting these goals. It feels to me that FB is a mega corp where the people in it want to do better, but the machine is so much bigger than the people in it that they are currently lost on how to do it without torching the whole company.
by jtwaleson on 10/3/20, 6:24 AM
I'm still on HN, LinkedIn and WhatsApp. Every time I open LinkedIn I'm shocked at how addictive the feed is. I go there to message someone and before I know it 10 minutes have gone by and I've forgotten what I went to do in the first place. WhatsApp is really great, except that it is owned by FB and they still extract value from me.
In 2015 I turned off all notifications on my phone. Quitting FB and going notification-free have really improved my state of mind.
Overall, I'm pretty happy with how I use social media, but I'm very worried about how my kids will be able to handle it in their teens.
by joshribakoff on 10/3/20, 5:18 AM
by godelski on 10/3/20, 7:00 AM
I wonder what happened in 2018
> [Cambridge Analytica] closed operations in 2018 in the course of the Facebook–Cambridge Analytica data scandal, although related firms still exist.[0]
I'm sorry, but if you're going to lead with a policy change that was made after pretty much the largest scandal in your company's history, I think you know you're doing something wrong. If you want to claim the moral high ground, changes need to happen before they become scandals. It is okay to make mistakes. It is okay to fuck up big time. But this just feels disingenuous. The reason people like The Social Dilemma is because it is real people saying "sorry, we fucked up. We take the blame, but let's solve the problem." This response feels like a childish "We didn't fuck up, you did."
> We don't sell your information to anyone.
IIRC the movie didn't claim this. Most people criticizing FB aren't claiming this. FB is selling access to the data.
> just like any dating app, Amazon, Uber, and countless other consumer-facing apps
I'm reminded of my mom saying "If all your friends jumped off a cliff would you?" I can't be the only one. Just because Netflix is guilty of similar crimes doesn't mean you're in the right. No one respects this defense.
This response is weak and does not feel genuine.
by martingordon on 10/3/20, 5:28 AM
50M hours / 1.5B DAU is about 0.03 hours, or roughly 2 minutes per person. The average person spends 1h15m on FB, so this is less than a 3% drop in overall time, but likely a larger proportion of that time is spent scrolling and seeing more ads than it is fixed on a single video.
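The arithmetic above can be checked with a quick sketch. The 50M hours and 1.5B DAU figures are from the comment; the 75-minute (1h15m) average daily time is the commenter's own estimate:

```python
# Back-of-envelope check of the per-user impact figures above.
reduced_hours = 50_000_000       # hours of watch time removed (per the comment)
daily_users = 1_500_000_000      # Facebook daily active users (per the comment)
avg_minutes = 75                 # assumed average daily time on FB (1h15m)

per_user_minutes = reduced_hours / daily_users * 60
share_of_daily_time = per_user_minutes / avg_minutes

print(f"{per_user_minutes:.1f} minutes per user")      # -> 2.0 minutes per user
print(f"{share_of_daily_time:.1%} of daily time")      # -> 2.7% of daily time
```

So the reduction works out to about 2 minutes per user, under 3% of the assumed daily usage, which matches the comment's estimate.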
by imheretolearn on 10/3/20, 5:30 AM
How are you providing value to people when you show articles from the same perspective to the same person? I would value a product if it gave me different perspectives on the same issue. I guess their definition of value is distorted. And therein lies the problem, and the reason for filter bubbles.
Try logging out of Facebook and then logging in after a week. You will see a significant increase in the rate of notifications. How is this not driving usage?
>> We provide advertisers with reports about the kinds of people who are seeing their ads and how their ads are performing
Well, technically you are not the product. But when you combine hundreds of data points from millions of people and give access to advertisers to that data, you are a part of the product. So I guess, it should say, "You are a tiny part of the product". There, I fixed it.
>> Facebook uses algorithms to improve the experience for people using our apps—just like any dating app, Amazon, Uber, and countless other consumer-facing apps that people interact with every day. That also includes Netflix, which uses an algorithm to determine who it thinks should watch ‘The Social Dilemma’ film, and then recommends it to them.
The key difference between Facebook and other services is that Facebook is a "social network". Things I post on Facebook are viewed by my friends, family, and colleagues, which has an impact on how others perceive me and on my social status. Facebook has the potential to literally shape my public image and my relationships. They completely fail to address this. I definitely don't get to choose the articles spewed by their algorithm, and the articles I read/am shown definitely influence my thinking.
>> The overwhelming majority of content that people see on Facebook is not polarizing or even political
True, but there are certain topics which are "hot" topics. People usually have strong opinions on topics like religion, politics, sexual orientation, etc. I wouldn't care that my friend is a cat person while I am a dog person. However, it would matter to me if my friend supported a candidate that I vehemently oppose. People usually lose their senses when it comes to the "hot" topics, so a post on these topics has a disproportionately larger effect than a post about a vacation my friend is taking.
by stephenhuey on 10/3/20, 6:16 AM
by ignoramous on 10/3/20, 4:57 AM
by remarkEon on 10/3/20, 5:51 AM
I thought about writing a point-by-point rebuttal to this, about the obvious lies and wilful misdirection ...
But then I realized that I just don't care anymore. I love being free of this company and I almost got sucked back in to writing flame war comments about [thing that gets more attention].
Folks, just give this company up. You'll be much happier, I promise.
by skim_milk on 10/3/20, 5:44 AM
Ha, I just finished watching the movie an hour ago with a friend. Of course it's obvious how the debate was going to be framed from the beginning: Facebook is supposed to be "unbiased", and from this point of view political inaction is the high-minded route over traditional low-minded autocratic rule. Instead of curating a high-quality news feed for users to consume, users get to pick what they want to listen to, because forcing a user to watch a curated news feed would be biased against the news organizations that operate and compete for engagement on it. So Facebook does have smart leadership and has been proving itself, with some fake-news initiatives, to resemble an unbiased company more than an autocratic government.
However, our individual news feeds are not unbiased. Letting individuals control what they see increases their bias. From this perspective, Facebook is now inherently biased towards extremism (our sad reality proves this point).
The way out of this situation is confusing and neither Facebook nor the movie talk of any real action. How about, instead of trying to clutch our unbiased pearls, we collectively learn to appreciate and understand inherent biases in everything because everything involving humans and our psyche is inherently biased and there is no end objective truth to the big issues surrounding our differences in values.
Now the question becomes not one of bias vs facts, but of better and worse bias. If we could create a Facebook news stream that is inherently biased towards bringing people together rather than splitting us apart, it would appear to solve this new question quite cleanly and, as a plus, fix our pathetic situation. This isn't a radical new idea created in my head 2 hours ago; it's a rather established view on this important issue: https://youtu.be/ZbPt66TYsFM
The big question, I think, should be how we design this "good" bias into our social media, and how we convince everyone that everything is biased (so there is literally no point in looking for unbiased sources of anything), in order to reframe the debate as one of good versus bad bias.
by corobo on 10/3/20, 11:50 AM
Am gonna watch this Social Dilemma myself now. Cheers for vouching for it, FB.
by throwawaysea on 10/3/20, 5:47 AM
> This extension is completely written in Javascript, and running completely within users Chrome browsers. All the source code can be viewed with Chrome Developer Tools. The code has been analyzed many times by many other developers. The extension has been used by nearly 200,000 users. If there is anything bad, it would have already been reported. You can feel absolutely safe to use it.
Does anyone know if this is indeed a safe tool to recommend to friends and family looking to thin their social media presence?
Are there equivalents for other social media platforms like TikTok, Twitter, Reddit, Instagram, etc?
by mmww on 10/3/20, 5:11 AM
`unless you give us permission`
lol
by Meph504 on 10/3/20, 6:15 PM
"The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family. "
But in my case, and for many of the people I know, you can't even open Facebook without seeing some highly charged political bullshit. I've largely abandoned the platform, as it seems all it is is an echo chamber of people I know, of all political leanings, posting shit that strongly, and often inaccurately, supports their political viewpoint.
It's no longer a platform of people casually sharing their personal lives; it seems an endless stream of worthless shares and reposts.
I really want to see a social network that limits the ability to repost and "share" content like this, maybe even going so far as to do copy-and-paste content checks to make sure it's not just shitpost copypasta.
by adamlangsner on 10/3/20, 1:28 PM
by elmomle on 10/3/20, 7:06 AM
Instead, the answer was to trot out some absolutely inexcusable, banal platitudes. And now we get to see similar effects roiling the USA and seeping more and more into the world at large. The fact that they're continuing to trot out such platitudes this far in is simply indescribable. I understand the psychological foibles that can lead very smart people to deny the monstrosity of their own creations, but I am nevertheless filled with fathomless rage that Mark Zuckerberg prioritized his ego and his shareholders above the good of all mankind, whether that was conscious or driven simply by his unwillingness to accept the true nature of his creation.
by tossthere on 10/3/20, 6:26 AM
Directly responding to the film by name is such an obviously terrible idea PR-wise, I’m shocked to even be seeing this from Facebook. It must be an accident.
by KoenDG on 10/3/20, 4:30 PM
So literally like any product saying "We of product X recommend product X".
And there's also the fact that if they didn't put out this document, they might later get sued, and the lack of response to "The Social Dilemma" could be used as an argument that they don't care, and as such they'd lose the case.
But about that data...
See, I've always heard Facebook makes the largest part of its money by selling data about its users.
Is that not true? Or, even worse, was it wrong of me to assign value to articles and news stories that stated this? If so, why?
What about the reverse? Would there be those who consider it foolish to place stock in a statement that says Facebook does not sell user data, or sells it anonymized?
It feels to me like a deliberate attempt to move all conversation on the subject into the territory of "you can't really know for sure, so SHUT UP".
Get people to stop talking about it, by virtue of casting doubt on everything anyone says.
Meanwhile, Facebook keeps doing whatever it's doing. Nothing changes.
Facebook could fix this problem by opening up about what it does with the data. But it refuses to do that. Trade secrets, I suppose?
Either way, it's Facebook who's handling the data, not us. As such, all responsibility falls on them, not us.
by erling on 10/3/20, 6:25 AM
by rich_sasha on 10/3/20, 5:10 AM
I haven’t watched “The Social Dilemma”; I think it is odd to mix fiction and documentary. But just looking at this denial, I don’t buy it.
by bot41 on 10/3/20, 12:14 PM
"The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family."
When I used FB I didn't find it polarizing or political but I did find I couldn't see content from people I knew. Other pages and crap overtook it and filled my feed. It was essentially useless to me.
I don't think FB is evil.. I think they are just about competent. Everything from them has been crappy. Remember when they spammed emails, chat was terrible, the apps were terrible, the notification icons didn't work on the website, FB messenger didn't work well.. etc, etc. Some things have been improved now but their talent is pretty lacking for a big tech company.
by dgellow on 10/3/20, 6:20 AM
by coffeefirst on 10/3/20, 2:52 PM
If they were really doing the best they can, the logical conclusion is no amount of effort can actually fix this, and a global, engagement-based social network is too toxic to exist.
by prepend on 10/3/20, 12:23 PM
If so, that’s a positive thing for them. A pdf doesn’t have the Facebook like tracking and shadow tracking. They can still see some traffic from the raw http gets, but it skips the ubiquitous Facebook analytics that the documentary talks about.
by mettamage on 10/3/20, 6:08 AM
This was my conclusion too. There were some genuine good parts, but most of it felt too hyped because the explanations were too flavorful and simplistic, or taken too far. It felt like I was watching The Secret or Zeitgeist.
> Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems.
This sentence lacks the nuance that they criticize. While I think it's a fair position to state that it gives a distorted view (due to simplification, for example), that doesn't mean they're creating (or intending to create) a convenient scapegoat.
> The film’s creators do not include insights from those currently working at the companies or any experts that take a different view to the narrative put forward by the film.
Well current employees have a lot to lose in being honest, if they disagree. And the documentary did have experts that had a different view that was actually telling a different story than the narrative. I was actually on the lookout for it, because I was upset with the simplistic cookie cutter bullshit it was feeding us.
---- Reacting to the points ----
1. They say they're making efforts to make responsible use possible. The issue with this is that we can't verify it, so reputation and trust are all we can go on. I'm putting this in the neutral category.
2.
> We don't sell your information to anyone.
Wasn't Cambridge Analytica a thing? If so, then I don't care whether you sell or don't sell; info gets out there. It feels disingenuous to put that sentence in there as a response to "you are not the product".
I agree that the mantra "you are the product" is nonsensical. Uhuh, right, because it's short and sweet it must be true? Nonsense; none of the explanations I've ever read that were true devolved into mantra-like statements. IMO it's a "code smell" that something is off.
...
I'll stop here, this is getting way too long.
Long story short: Facebook is being questionable here at best, even though I agree with them in general that The Social Dilemma is nonsense.
The incentives here are all warped :/
by nyxtom on 10/3/20, 1:36 PM
by zxcb1 on 10/3/20, 6:06 AM
by sakisv on 10/3/20, 5:32 AM
Also, fuck you Facebook. You had multiple chances to redeem yourself and start being not-so-corrosive for the society and you intentionally went out of your way to make the wrong choice every single time.
by seriocomic on 10/3/20, 11:09 AM
by cratermoon on 10/3/20, 5:16 AM
by doomguy007 on 10/3/20, 9:35 AM
by aaron695 on 10/3/20, 5:31 AM
The Social Dilemma explains how the stupid 'you are the product' meme is incorrect, for instance.
It also explains that the emergent behaviours are not planned, so they are equally hard to stop with planned actions.