A filter bubble is a state of intellectual isolation that results when a website algorithm selectively guesses what information a user would like to see based on information about that user, such as location, past click behaviour and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google’s personalised search results and Facebook’s personalised news stream. The term was coined by internet activist Eli Pariser in his book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. Pariser related an example in which one user who searched Google for “BP” got investment news about British Petroleum, while another got information about the Deepwater Horizon oil spill; the two search results pages were “strikingly different.” According to Pariser, the bubble effect may have negative implications for civic discourse, though contrasting views suggest the effect is minimal and addressable.
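The personalisation described above can be pictured as a ranking step that favours content matching a user's past behaviour. The following is a deliberately simplified sketch, not any real search engine's algorithm; all names, fields, and the topic-overlap scoring are illustrative assumptions (real systems combine far richer signals such as location, search history and social connections):

```python
from collections import Counter

def personalised_ranking(items, click_history):
    """Rank items by overlap with topics the user has clicked before.

    Illustrative only: a toy stand-in for the kind of personalisation
    described above, not an actual search engine's method.
    """
    # Tally how often each topic appears in the user's click history.
    interest = Counter(topic for item in click_history for topic in item["topics"])

    def score(item):
        # Items sharing more of the user's favoured topics score higher.
        return sum(interest[t] for t in item["topics"])

    return sorted(items, key=score, reverse=True)

# A user whose history leans toward investing news...
history = [
    {"topics": ["investing", "energy"]},
    {"topics": ["investing", "markets"]},
]
results = [
    {"title": "BP share price climbs", "topics": ["investing", "energy"]},
    {"title": "Deepwater Horizon spill report", "topics": ["environment"]},
]
ranked = personalised_ranking(results, history)
# ...sees the investment story ranked above the spill coverage,
# mirroring Pariser's "BP" example.
```

The point of the sketch is the failure mode: content outside the user's established interests (here, the spill report) is systematically pushed down, which is exactly the separation effect the term describes.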
How to Burst the “Filter Bubble” that Protects Us from Opposing Views
Computer scientists have devised a way to number-crunch an individual’s own preferences to recommend content from others with opposing views. The goal? To burst the “filter bubble” that surrounds us with people we like and content we agree with.
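One way to picture the idea above is to invert the usual recommender objective: instead of maximising similarity to the user's preferences, surface the items that differ from them most. This is a minimal sketch under assumed simplifications (a one-dimensional “stance” score from −1 to 1); the researchers' actual method is not specified in the text:

```python
def recommend_opposing(items, user_stance):
    """Order items so those least aligned with the user's stance come first.

    Hypothetical illustration: 'stance' is an assumed scalar summary of
    a piece of content's viewpoint, not part of any published system.
    """
    # Larger distance from the user's own stance ranks earlier.
    return sorted(items, key=lambda item: -abs(item["stance"] - user_stance))

items = [
    {"title": "Op-ed A", "stance": 0.9},
    {"title": "Op-ed B", "stance": -0.8},
    {"title": "Op-ed C", "stance": 0.1},
]
picks = recommend_opposing(items, user_stance=0.9)
# For a user at stance 0.9, "Op-ed B" (stance -0.8) ranks first:
# it is the furthest from the user's own view.
```

A conventional recommender would flip the sort order (smallest distance first); the single sign change is what turns a bubble-reinforcing ranking into a bubble-bursting one.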
Escape your Search Engine’s filter bubble
An illustrated guide by DuckDuckGo