# Filter and Label Your Email

## Email Filtering Exercise

![](https://i.imgur.com/FWYqOKa.png)

![](https://i.imgur.com/O8jyH5t.png)

--------

## Reading Engagement

There's a distinct irony in the contradictory analysis of present internet culture and its effects on democracy: the subject under analysis is a lack of consensus and shared "truth," yet the academics discussing the matter don't share a common thesis either. The main author discussed in all three readings, Eli Pariser, is firmly in the camp that filter bubbles are real and likely quite harmful. Facebook's own findings (to the extent they can be trusted), as well as writers such as Jacob Weisberg, dispute a number of Pariser's concerns. They suggest that Facebook doesn't hide as much opposing content as has been claimed, or that Google search results aren't as user-specific as Pariser found in his very small-N study (*Echo Chamber Revisited*).

There's a certain contradiction at the heart of concerns about internet misinformation. On one hand, we worry about filter bubbles: the idea that people only take in like-minded information. On the other, there's a concern about the sheer quantity of information an individual encounters, both factual and false. The problem is that when there is too much to verify, people default to whatever they already agree with. But that "problem" fundamentally means people are being exposed to other views, potentially even *over-exposed*. Clay Shirky hints at this idea, though more from the angle of aggravation: being online lets people realize just how many people hold opposing viewpoints, which actually causes them to double down and become more polarized (*Echo Chamber Revisited*).