The Filter Bubble
“Facebook was looking at which links I clicked on, and it was noticing that I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.” (Eli Pariser)
http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles.html
It is an unfortunate fact that most people tend to see only information which confirms their existing beliefs. We choose friends who share our views, we choose news programs which report things in a way we find agreeable, and now, thanks to the filter bubble effect, even Google and Facebook selectively serve us more of whatever we have previously liked and clicked on.
This is a real problem for those of us who are genuinely interested in finding the truth in this sea of opinions. How do we inform ourselves fully when everywhere we look (whether by design or by accident) we see only more confirmation of what we already believe? Perhaps more importantly, how do we reach everyone else who is trapped in their own bubble of self-confirmation and doesn't even realise it?
In an attempt to help with this problem, we have recently launched an application which offers a surprisingly simple way out of this self-confirmation bubble for anyone who cares to look. It is called rbutr, and it simply allows people to connect a webpage which makes a claim to another webpage which rebuts that claim. Any future visitor to the claim page can then see that the page has been rebutted, and can easily click through to read the rebuttal.
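At its core, the idea can be pictured as a many-to-many mapping from claim URLs to rebuttal URLs, consulted whenever a user lands on a page. The sketch below illustrates that idea in Python; the function names, the in-memory store, and the URL canonicalization step are all illustrative assumptions, not rbutr's actual implementation.

```python
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

# Hypothetical in-memory store: canonical claim URL -> list of rebuttal URLs.
# A real service would back this with a database; this is only a sketch.
_rebuttals: dict[str, list[str]] = defaultdict(list)

def canonicalize(url: str) -> str:
    """Normalize a URL so trivially different forms map to the same page."""
    parts = urlsplit(url.strip())
    # Lowercase scheme and host, drop the fragment, trim a trailing slash.
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.rstrip("/"), parts.query, ""))

def add_rebuttal(claim_url: str, rebuttal_url: str) -> None:
    """Record that rebuttal_url rebuts the claim made at claim_url."""
    _rebuttals[canonicalize(claim_url)].append(rebuttal_url)

def get_rebuttals(visited_url: str) -> list[str]:
    """Called when a user visits a page: return any known rebuttals for it."""
    return _rebuttals.get(canonicalize(visited_url), [])

if __name__ == "__main__":
    add_rebuttal("http://example.com/claim", "http://example.org/rebuttal")
    # The lookup still matches despite the mixed-case host and trailing slash.
    print(get_rebuttals("http://EXAMPLE.com/claim/"))
```

One design point worth noting: some form of URL normalization is essential, since the same claim page can be reached through trivially different URLs (mixed-case hosts, trailing slashes, fragments), and a lookup that treated those as distinct pages would miss most rebuttals.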