# Filtering and Fake News
## Filtering my Email
Going through my email every day has become a bigger and bigger chore since I arrived at Northeastern and became involved in classes, clubs, and the university community. So many different emails arrive in my inbox that things easily get lost (which happens to a concerning extent). After creating these rules to filter my email, I hope my inbox becomes easier to navigate and important content does not get lost as often as it does now. I filtered NU News, which should lighten my inbox, and the Explore Program newsletter and emails, which will help me find announcements for the Explore class that I am a TA for.
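
As a rough illustration of what these rules do, here is a minimal Python sketch of rule-based filtering. The sender address, subject keyword, and folder names are hypothetical stand-ins rather than my actual mail settings, and real clients like Outlook or Gmail apply rules like these internally rather than through code.

```python
# A minimal sketch of rule-based email filtering, assuming each message
# is a simple dict. The addresses, keywords, and folder names below are
# hypothetical stand-ins for illustration only.

FILTER_RULES = [
    # (condition on the message, destination folder)
    (lambda msg: "nu-news@northeastern.edu" in msg["from"], "NU News"),
    (lambda msg: "explore program" in msg["subject"].lower(), "Explore Program"),
]

def route_message(msg):
    """Return the folder a message should be filed into, or 'Inbox'."""
    for condition, folder in FILTER_RULES:
        if condition(msg):
            return folder
    return "Inbox"

sample = {"from": "nu-news@northeastern.edu", "subject": "This Week at NU"}
print(route_message(sample))  # -> NU News
```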

## Fake News and Filter Bubbles
The internet's vast interconnectedness, despite its great potential, is destroying society. As Claire Wardle says in *Understanding Information Disorder*, "our information system is now dangerously polluted and is dividing us rather than connecting us." She discusses how disinformation spreads across websites and social media while private information is collected for marketing. However, rather than stopping this wave of toxicity, some political figures encourage it, and even credible journalists make the situation worse by giving attention to the fake news phenomenon.
This only becomes worse with filter bubbles. Filter bubbles are useful tools for the social media platforms and news websites themselves: readers are more likely to engage with articles they find interesting, so the sites' algorithms show them those articles to drive engagement and revenue. This creates a "personal ecosystem of information" (FS), in which users receive different information even when searching for the same thing. FS cites an example from Eli Pariser's blog in which two users both searched for BP: one was shown stock information and the other news about an oil spill. If algorithms provide different information to different users, it can divide society, as people will have different ideas of the truth.
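
To make that mechanism concrete, here is a toy Python sketch of engagement-driven ranking. The scoring scheme, field names, and click counts are my own illustrative assumptions, not any real platform's algorithm, but they show how two users searching the same term can be shown different results.

```python
# A toy sketch of engagement-driven ranking: stories on topics a user has
# clicked before get boosted, so identical searches surface different
# results for different users. Scores and fields are assumptions.

def rank_articles(articles, topic_clicks):
    """Sort articles by base relevance plus the user's past clicks on
    each article's topic, so heavy engagement narrows what surfaces."""
    def score(article):
        return article["relevance"] + topic_clicks.get(article["topic"], 0)
    return sorted(articles, key=score, reverse=True)

articles = [
    {"title": "BP stock rises", "topic": "finance", "relevance": 1.0},
    {"title": "BP oil spill update", "topic": "environment", "relevance": 1.0},
]

investor = {"finance": 5}      # often clicks finance stories
activist = {"environment": 5}  # often clicks environment stories

print(rank_articles(articles, investor)[0]["title"])  # BP stock rises
print(rank_articles(articles, activist)[0]["title"])  # BP oil spill update
```

Both users asked the same question, but their histories pushed different answers to the top, which is exactly Pariser's BP example.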
Therefore, I believe that even if everyone had suitable media literacy skills, we would still end up in filter bubbles. The algorithms that have divided society are not within the general public's control, but rather in the hands of corporate executives who care more about profit than the good of humanity. Danah Boyd points out the need for media literacy programs, citing a case from her research of a Midwestern teen deeply misinformed about pregnancy and STDs.

Additionally, hearing everywhere that there is so much fake news can create a cultural sense of mistrust. Boyd emphasizes that people are raised to trust themselves and trust their gut, meaning that if something feels wrong, they may not believe it. This instinct is beneficial in many situations, especially those involving our personal safety, but it ultimately raises the question of which information we can trust and which we cannot. In my opinion, this makes information subjective: it depends on whether or not people trust the information or its source. I hope that one day society can be media literate, people can come together to spread truth, and companies can prioritize morality over money, but for now, I do not believe that media literacy skills can save us from the plague of disinformation.