“Filter bubbles” or who’s holding your hand?
Facebook showing you only the posts from friends you interact with most? Google Search giving different results to you and your mate in the same class?
Personalized search might be narrowing our worldview, says pioneering online organizer Eli Pariser. He argues that as web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: we get trapped in a “filter bubble” and don’t get exposed to information that could challenge or broaden our worldview. Mr Pariser believes this will ultimately prove to be bad for us and bad for democracy.
I think Mr Pariser is absolutely right. Without any warning, Facebook has started behaving exactly like this, and Google Search works in a similar way: the results we get are meant to “make us happy”, so we often only see what “we want” to see. And “we want” equals whatever the major online players think we want. So, essentially, Google, Facebook, Microsoft, et al. can decide, to a certain extent, what is and what isn’t easily accessible to us. I have to agree with Eli Pariser that the consequences of this for an individual’s personal development are mostly negative. Just as in face-to-face debate, the Internet should expose us to information that might upset us, make us sad, make us realise we are wrong about something, or simply be completely irrelevant to what we originally “wanted” to see or read; in the long run, that experience makes us better-developed intellectuals.
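To make the mechanism a bit more concrete, here is a minimal, purely hypothetical sketch in Python of what engagement-based filtering looks like. The friend names, scores and threshold are all invented for illustration; this is not Facebook’s or Google’s actual ranking code, just the general idea of ranking by past engagement and hiding everything else.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    text: str

# Hypothetical engagement history: how often "we" interacted with each friend.
engagement = {"alice": 0.9, "bob": 0.7, "carol": 0.1, "dave": 0.05}

posts = [
    Post("alice", "sports", "Great match last night!"),
    Post("carol", "politics", "A view you might disagree with."),
    Post("bob", "music", "New album out today."),
    Post("dave", "science", "Interesting study on climate."),
]

def personalised_feed(posts, engagement, threshold=0.5):
    """Keep only posts from people we already engage with a lot,
    ranked by how much we engage with them."""
    scored = [(engagement.get(p.author, 0.0), p) for p in posts]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p for score, p in scored if score >= threshold]

for post in personalised_feed(posts, engagement):
    print(post.author, "-", post.text)
# carol's and dave's posts, the ones most likely to challenge us, never appear.
```

The point of the toy example is simply that the filtering happens before we ever see the feed: whatever falls below the platform’s (invisible) threshold does not merely rank lower, it vanishes.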
However, if our online experience becomes constant and linear, the information on the Internet will turn dull and most users will end up being continuously brainwashed (unconsciously, of course!). And I’m sure this is not what we want. We DO NOT want this 21st-century-style censorship applied to OUR Internet.
"Filter bubbles" or who's holding your hand?,