# Filtering and Labeling Your Email

Filters on email are just one example of the ways we limit the knowledge we receive. Oftentimes, we don't realize which of our actions create the "filter bubble" we live in. Initially, I wasn't aware of the labeling and filtering capabilities that Gmail had. I've been an avid Gmail proponent and user for years, and yet this feature had never entered my mind. After one year of using the music production software Logic Pro X, I had memorized hundreds of commands; I use Gmail far more often than any production software, yet I'd never felt the need to utilize the organization and filtering tools Gmail offers. I didn't realize there was more to Gmail than sending and receiving emails, which sounds quite ignorant looking back on it. But it's not just email I'm uninformed about; it's the algorithmic process behind every website.
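For anyone else who, like me, never looked past the inbox: Gmail's labels and filters can even be created programmatically through Google's Gmail API. The sketch below (Python, using the google-api-python-client and google-auth-oauthlib libraries) builds a label and a filter that routes matching mail out of the inbox. It's a minimal sketch, assuming you have a Google Cloud project with the Gmail API enabled; the `client_secret.json` path, the "Newsletters" label name, and the sender address are all hypothetical placeholders.

```python
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# Scopes needed to manage labels and filter settings.
SCOPES = [
    "https://www.googleapis.com/auth/gmail.labels",
    "https://www.googleapis.com/auth/gmail.settings.basic",
]

# Runs a local OAuth flow; client_secret.json comes from a Google Cloud
# project with the Gmail API enabled (the path here is a placeholder).
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
creds = flow.run_local_server(port=0)

service = build("gmail", "v1", credentials=creds)

# Create a label to file the filtered mail under ("Newsletters" is a
# placeholder name).
label = service.users().labels().create(
    userId="me",
    body={"name": "Newsletters"},
).execute()

# Create a filter: anything from this (hypothetical) sender gets the new
# label and is archived out of the inbox.
service.users().settings().filters().create(
    userId="me",
    body={
        "criteria": {"from": "newsletter@example.com"},
        "action": {
            "addLabelIds": [label["id"]],
            "removeLabelIds": ["INBOX"],
        },
    },
).execute()
```

Of course, the same effect is available from Gmail's settings page without writing a line of code; the point is that the filtering was there all along, waiting to be used.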
The "filter bubble," a term coined by speaker and writer Eli Pariser, is the idea that the websites and sources we rely on keep our feeds, and the content in them, consistent so as to avoid a variance of perspectives. Pariser is known for his 2011 TED Talk on the dangers of the filter bubble, in which he called out companies (Facebook in particular) as proponents of this practice. The podcast "On the Media" outlines a similar phenomenon: the echo chamber. Echo chambers arise as a result of filter bubbles and refer to the act of surrounding ourselves with like-minded individuals and opinions. Professor Cass Sunstein argues that the largest problem with these is "unjustified extremism." It's easy to believe in the extreme if it's all you know.
So how do these relate? How are echo chambers and filter bubbles affecting something as simple as Gmail? They show us how we can create our own: how we use our biases to filter even the emails we choose to see or prioritize. A filter bubble is the same mechanism taken to a much higher degree, the key difference being that you're not in control of the bubbles other websites put you in. Facebook's algorithm chooses what sits at the top of your feed; you, unfortunately, don't get to make that decision, right? Wrong! There are ways of combating the bias. Personally, I search for political ideologies and explanations that are extremely different from my own. It confuses the algorithm, because what it thinks I want to see is now different from what I actually want to see. Each of us can push back against these systems; it's just a matter of effort. The biggest problem is convincing people that there is a problem. Like-mindedness is comfortable, and the dissonance between differing opinions is often daunting, but it is exercises like these that remind me of the necessity of a variance of opinions and sources.