# Assignment 3: Filter and label your email

## Response to readings
As you scroll through social media, have you ever stopped to think about why you see the posts that you do, or what someone else's newsfeed might look like in comparison to yours? These three readings, from Farnam Street, Eli Pariser and WNYC, take up these questions and introduce the terms used to describe them: filter bubbles and echo chambers.
In the article from Farnam Street, the author plainly explains what a filter bubble is and how it relates to echo chambers. A filter bubble is the stream of information we are fed online as a result of an algorithm tracking what we click on. Filter bubbles are also shaped by who we are friends with online, much as our opinions can be shaped by who we are friends with in real life. As a result of these filter bubbles, we become stuck in echo chambers: we start to believe that the information and opinions we see (which often align with our own) are the only ones that exist, forgetting that people will always hold differing views.
Eli Pariser coined the term filter bubble. What's interesting is that in this article he challenges his own beliefs, in a way pushing back against his own echo chamber. He looks at a study Facebook conducted analysing how filter bubbles affect whether a user will see articles that oppose their political views. The study found that the algorithm plays only a small part in this; however, Pariser argues the effect is still significant, because without the algorithm a user would be more likely to see and read those opposing articles.
In a podcast published on WNYC, Jacob Weisberg talks about a study done by Eli Pariser. He got two people to search Google for BP during the 2010 oil spill, and their news results were completely different. Weisberg is skeptical of this study and of Pariser's claims about filter bubbles because they are not based on empirical evidence. I can see where Weisberg is coming from; however, I do think there is some truth in what Pariser is saying. A do-it-yourself version of Pariser's study is to compare your Netflix homepage with someone else's. Netflix tailors the preview image of each film or series based on what it thinks you like, so yours ends up different from the other person's. This is driven purely by your previous clicks on the site, just like what Google and social media platforms do to put you in a filter bubble.
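To make the mechanism behind these examples concrete, here is a minimal, purely hypothetical sketch (not taken from any of the readings, and not how Google or Netflix actually work) of click-based personalisation: posts are scored by how well they match the topics a user has already clicked, so the feed keeps serving more of the same.

```python
from collections import Counter

# Hypothetical illustration only: a tiny click-based recommender.
# Each post is tagged with topics; the "algorithm" simply counts the
# topics a user has clicked on and ranks new posts by how well they
# match that history, so the feed narrows toward past clicks.

posts = [
    {"title": "Oil spill cleanup update", "topics": ["environment", "news"]},
    {"title": "New comedy series trailer", "topics": ["entertainment"]},
    {"title": "Climate policy debate", "topics": ["environment", "politics"]},
    {"title": "Celebrity interview", "topics": ["entertainment", "news"]},
]

def rank_feed(click_history, candidate_posts):
    """Rank posts by overlap with the topics the user clicked on before."""
    topic_counts = Counter(
        topic for post in click_history for topic in post["topics"]
    )
    def score(post):
        return sum(topic_counts[topic] for topic in post["topics"])
    return sorted(candidate_posts, key=score, reverse=True)

# A user who has only ever clicked environment stories...
history = [{"title": "BP oil spill", "topics": ["environment", "news"]}]
for post in rank_feed(history, posts):
    print(post["title"])
# ...sees environment and news stories first: a filter bubble in miniature.
```

Even this toy version shows the feedback loop the readings describe: every click strengthens the topics already in your history, so the ranking drifts further toward what you have seen before.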