by gw666 on 10/17/19, 5:39 AM with 2 comments
No matter how imperfect the implementation might be, it'd be great to have anything that exposes the general public to the idea that people need to consider the source of anything posted on the Internet.
What would you try doing?
by samkater on 10/18/19, 3:45 AM
Not sure if or how this solves an echo-chamber problem though.
Edit: just also had the thought to help tackle bias problems - the platform itself could produce biased content on either side of an issue from time-to-time to deduce people’s positions based on their votes. Then the reputation algorithm has a chance to adjust for ideas that polarize vs ones with general agreement and scale rankings accordingly.
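A rough sketch of what that adjustment could look like. Everything here is made up for illustration: users' sides are assumed to be inferred from their votes on the probe content, and the penalty factor is arbitrary.

```python
def polarization(votes):
    """votes: list of (user_side, vote) pairs, where user_side is
    'A' or 'B' (inferred from votes on the probe content) and vote
    is +1 or -1. Returns 0.0 when both sides agree, up to 1.0 when
    the sides vote in opposite directions."""
    by_side = {"A": [], "B": []}
    for side, v in votes:
        by_side[side].append(v)
    means = []
    for vs in by_side.values():
        if not vs:
            return 0.0  # only one side voted; can't measure a split
        means.append(sum(vs) / len(vs))
    # disagreement between the two sides' average votes, scaled to [0, 1]
    return abs(means[0] - means[1]) / 2

def adjusted_score(raw_score, votes, penalty=0.5):
    """Scale a ranking score down for polarizing items, so broad
    agreement outranks tribal voting. penalty is a tunable knob."""
    return raw_score * (1 - penalty * polarization(votes))
```

So an item every side likes keeps its full score, while one that splits cleanly along partisan lines gets knocked down, e.g. `adjusted_score(10, [("A", 1), ("B", -1)])` comes out to `5.0`.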
by ben509 on 10/20/19, 2:29 AM
I'd add a system to let people identify aspects of the post, and then use reputation to verify that they're a good indicator of that aspect.
Liberals know what is "liberal" and conservatives know what's "conservative" quite reliably (in aggregate), even though those concepts are very fuzzy. And the problem most systems run into is that, politics being politics, such labels are frequently gamed. (If you've ever listened to C-SPAN's radio program, you know that half the "republican" callers are democrats...)
Then you have to determine which keywords to use. I think some basic guidance, "don't label a thing 'spam' unless it's someone selling crap" would go a long way to incentivizing a critical mass of users to label a thing honestly.
And then you let readers decide what they want to read.
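One way that labeling-plus-reputation scheme could hang together, sketched loosely. The per-user, per-label reputation scores and the visibility threshold are invented here; presumably the real system would earn reputation by checking a user's past labels against eventual consensus.

```python
def label_support(tags, reputation):
    """tags: list of (user, label) pairs applied to one post.
    reputation: hypothetical {(user, label): score in [0, 1]} map
    tracking how reliable each user has been with that label.
    Returns each label's reputation-weighted support."""
    support = {}
    for user, label in tags:
        weight = reputation.get((user, label), 0.1)  # unknown users count a little
        support[label] = support.get(label, 0.0) + weight
    return support

def visible_labels(tags, reputation, threshold=1.0):
    """Only labels with enough weighted support get attached to the
    post; readers can then filter their feed on those labels."""
    return {label for label, s in label_support(tags, reputation).items()
            if s >= threshold}
```

The point of the weighting is exactly the honesty incentive above: a drive-by "spam" tag from users with no track record never crosses the threshold, while the same tag from a few users who've labeled accurately before does.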
My other notion (building on another comment[1]) is that discussion should include a team-based element. I think offering a way for small teams to come together and present ideas is more useful than individual commentary. A team allows less personal investment because you are now motivated by a desire for admiration from known peers.
But I've put away a good number of beers, so I might just be rambling like a drunk idiot.