# Filter and label your email

## Outlook rule creation

In addition to creating a rule to separate emails from 'News@Northeastern.edu,' I also created a rule that automatically moves confirmation emails for my co-op applications into a folder. The confirmation emails clog up my inbox, and this keeps them organized and safe in case I need to refer to them later.

![rule creation outlook](https://i.imgur.com/4u7VOXW.png)

![co-op application rule](https://i.imgur.com/pIWqUnd.png)

***

## Filter bubbles

The word “bubble” itself seems so harmless and friendly; people often claim that you can never sound angry while saying it. Attach the word “filter” in front of it, however, and you have a tool that polarizes citizens. The greatest danger of the [“filter bubble”](https://fs.blog/2017/07/filter-bubbles/) extends beyond hindering our ability to make balanced decisions or serving us personalized advertisements as we browse the web; it is a **threat to democracy itself.** Identified by [Eli Pariser](https://en.wikipedia.org/wiki/Eli_Pariser), filter bubbles are created by algorithms that dictate what we encounter online, building a unique ecosystem of information for each individual and sheltering us from opposing perspectives. According to [University of Chicago Professor Cass Sunstein](https://en.wikipedia.org/wiki/Cass_Sunstein), writing in 2004, filter bubbles “make a situation where people demonize those who disagree with them.” The statement holds true over 15 years later, as demonstrated by the [most recent presidential election](https://time.com/5907318/polarization-2020-election/).

So what’s so bad about surrounding yourself with people and opinions you agree with? Isn’t that our individual choice, some may ask. The personalized content we see may seem to make things easier, sparing us the effort of digging deeper ourselves. Yet in exchange for that appearance of ease, we sacrifice privacy and individual choice and allow our personal identities to be altered.
Online platforms promote sensational, extremist material so that you keep coming back rather than visiting other websites where you might discover different perspectives and ideas. We close our circles to include only people who reinforce our beliefs, and anyone who disagrees is villainized. As a result, [echo chambers](https://en.wikipedia.org/wiki/Echo_chamber_(media)) are created; we quickly mistake our own experiences and beliefs for reality and can become unable to make rational, logical decisions. For example, if individuals solely read fake news articles that picked apart Hillary Clinton’s character and surrounded themselves with others who hated her, they would likely be among the many who believed she was running a child-trafficking ring through a pizza shop in the [2016 Pizzagate conspiracy theory](https://www.rollingstone.com/feature/anatomy-of-a-fake-news-scandal-125877/). I am curious about the degree to which such passive acceptance of distortion may influence us neurologically and psychologically. But the trickiest part of escaping a filter bubble is that we don’t know we are in one unless we take deliberate steps to leave it.

![echo chamber](https://www.thepacer.net/wp-content/uploads/2018/04/echo-chamber-297x420.png)

And as much as we can take preventative measures to avoid falling into filter bubbles, such as using ad-blocker extensions and prioritizing education over entertainment, individual agency may not be enough. I believe the **responsibility ultimately lies with the leaders behind popular media platforms,** such as [Facebook’s](https://facebook.com) Mark Zuckerberg, [Twitter’s](https://twitter.com) Jack Dorsey, [Instagram’s](https://instagram.com) Kevin Systrom, and [Google’s](https://google.com) Sundar Pichai.