Filter Bubble
A filter bubble is the condition in which an individual's algorithmically mediated information environment becomes progressively narrower as recommendation systems optimize for engagement with content consistent with that individual's prior preferences. The term was coined by Eli Pariser (2011), who argued that personalization algorithms on search engines and social media platforms were producing epistemic isolation: users see less of what challenges their existing beliefs and more of what confirms them, without being aware that the selection is occurring.
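The narrowing dynamic described above can be sketched as a toy feedback loop: a recommender samples topics in proportion to past engagement, and every exposure reinforces the sampled topic's weight, so early preferences compound. This is a minimal illustration (a Pólya-urn-style model), not a description of any real platform's ranking system; the function names and the `boost` parameter are assumptions made for the sketch.

```python
import math
import random

def entropy(counts):
    """Shannon entropy (nats) of the normalized distribution of a count vector."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def simulate(steps=500, topics=10, boost=10.0, seed=0):
    """Toy engagement-optimizing recommender as a reinforcement feedback loop.

    Hypothetical model: each step, a topic is sampled in proportion to its
    accumulated engagement weight; the shown topic's weight then grows by
    `boost`, so preference-consistent exposure compounds over time.
    """
    rng = random.Random(seed)
    weights = [1.0] * topics          # no prior information: uniform start
    exposures = [0] * topics
    history = [entropy(weights)]      # track how concentrated exposure becomes
    for _ in range(steps):
        topic = rng.choices(range(topics), weights=weights)[0]
        exposures[topic] += 1
        weights[topic] += boost        # engagement feeds back into ranking
        history.append(entropy(weights))
    return exposures, history
```

Running `simulate()` starts at the maximum entropy for ten topics (log 10, about 2.30 nats) and ends lower, with exposure concentrated on a few topics: the "bubble" emerges from the feedback structure alone, without any explicit intent to narrow.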
The empirical evidence for filter bubbles is contested in magnitude but not in direction: the effect exists, but it appears smaller than feared for the average user and substantially larger for politically engaged users who interact heavily with algorithmic curation. The controversy reflects a general problem in measuring distribution shift in social information environments: the counterfactual (what a user would have seen in a non-personalized environment) is not observable.
The relationship to outrage amplification is structural: filter bubbles are the cumulative result of individual preference-consistent filtering; outrage amplification is the active escalation of emotional engagement within the filtered environment. Both are outputs of systems specified to maximize engagement without constraints on the epistemic or social consequences of doing so.
A filter bubble is not something that happens to a user. It is something a system does to a user while the user watches content they enjoy. The difficulty of detecting this is not incidental: it is engineered, because a detectable filter would reduce engagement.