# Reading Responses Set 1

### Jan 28 Fri - How the Web Works

When waiting for a webpage to load, it takes about two seconds before I start getting impatient. However, after learning about the countless players involved in making the Web work, I'm surprised it doesn't take hours. MDN explains that for a site to load, the client (your computer, for instance) sends a request to a server (a computer that stores websites). When the server fulfills that request, the site appears in your browser. However, factors like the quality of your internet connection, or the rules of HTTP, the protocol that specifies the language clients and servers use, can complicate the process. Hartley Brody expands on this, detailing how and why websites use HTTPS, a secure version of HTTP. For sites that handle sensitive information like passwords, security is necessary to keep hackers out. Clients and servers use cryptography to encrypt the information and prevent data leaks. However, sites often do not use HTTPS because the extra security demands more bandwidth and processing power.

With the many intermediaries involved in the web, I was curious about what happens when technology like a server breaks. It made me think of when the **[largest data center in the Northeast, conveniently located in the downtown Macy's, caught on fire](https://www.universalhub.com/2018/shorting-power-supplies-spark-fire-downtown)**. The outage left the majority of MIT without wi-fi for at least twelve hours. The U.S.'s dependence on wi-fi and technology is ever growing, with wi-fi necessary for schooling and phones linked to previously physical items like tickets. Are we consciously building technology and wi-fi into our lives so that they become essential? How can we ensure that servers never break? Who benefits from all of this? These questions go beyond the technical aspects of the readings, but I wonder, as society progresses, how or whether the web's structure will be exploited by those in power.
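The request-response cycle MDN describes can be sketched in a few lines of Python. Everything here is a stand-in: `example.com` is a placeholder host and the "response" is a canned string, so nothing actually goes over the wire.

```python
# A minimal sketch of the HTTP exchange: the client sends a plain-text
# request, and the server replies with a status line, headers, and a body.

def build_get_request(host: str, path: str = "/") -> str:
    """Compose the plain-text request a browser would send."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

def parse_status(response: str) -> tuple:
    """Pull the status code and reason phrase out of a raw response."""
    status_line = response.split("\r\n", 1)[0]   # e.g. "HTTP/1.1 200 OK"
    _, code, reason = status_line.split(" ", 2)
    return int(code), reason

request = build_get_request("example.com")
print(request.splitlines()[0])                   # GET / HTTP/1.1

canned_response = "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html></html>"
code, reason = parse_status(canned_response)
print(code, reason)                              # 200 OK
```

With HTTPS, this same text exchange still happens; it is just wrapped in an encrypted channel first, which is where the extra bandwidth and processing cost Brody mentions comes from.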
### Feb 4 Fri - Fake news

Whether it's a childhood friend or a distant relative, everyone knows someone who revealed a controversial opinion by sharing a fake news story on Facebook. But how do these stories, which often lack proof or blatantly lie, circulate? In the context of the 2016 election, Buzzfeed News discovered that the most viral stories came from newly created websites. Established news sources like the Washington Post received high engagement, but were still surpassed by these new and untrustworthy publications. While companies like Facebook and YouTube should change their algorithms to stop fake news, Boyd argues that the issue is deeply embedded in American culture itself. Seeking knowledge is framed as an act of independence, which makes each of us our own authority: if we believe something is right, it is right. Yes, algorithms enable articles to go viral. But it takes people distrusting longstanding publications and believing in their own media literacy to make them potent.

Prior to Boyd's article, I firmly held the stance that the burden of fake news lies on the companies. They need to alter algorithms and remove content to protect their readers; it should not be on the audience to discern fake from honest news. However, culture is an underlying factor that propels media manipulation and its spread. We need to look at the root and add media literacy to education curriculums across the U.S. These curriculums could show students how to evaluate sources with relevant examples, like the 2016 election or the vaccine rollout. We also need to consider how families and generations will continue to shape our beliefs about knowledge. My parents constantly send me articles, some real and some fake, and perhaps because of our relationship, I always take the time to read and engage with them. I suspect the same is true for my peers, which only increases engagement.
What was most terrifying yet comforting was that the societal forces around fake news are nothing unfamiliar: education versus upbringing, individual versus societal beliefs, emotions versus facts. By looking to the past, we may create a new path forward.

### Feb 15 Tues - Cooperation

While many of us have heard of the Prisoner's Dilemma, a common social experiment, I never understood how a model built on selfishness and competitiveness could lead to cooperation. Through discussing variations of the game that include punishments and rewards, Nowak argues that society operates best when we all cooperate. Punishment rarely helps even the punisher, and rewards lead to higher benefits all around. Nowak also discovered that reputation matters: if people can hold you accountable for your actions, you are more likely to "do the right thing." The surefire way to win a single round of the Prisoner's Dilemma is to be selfish, which Nowak says could become a form of natural selection: if it always benefits us to be selfish, then those who cooperate will slowly be edged out of society. But how can we shift people from mindsets of selfishness to cooperation? Gossip may seem trivial, but Reagle argues that it is how larger social networks stay connected. However, it can be limited by Dunbar's number, the theory that 150 is, "roughly, the cognitive limit of how many relationships humans can maintain given their complexity." By gossiping and sharing information, this kind of cooperation may help put Nowak's ideas into practice.

In past responses, I have argued that technology is not the problem; the issue is how we disseminate and interpret news. Drawing from Nowak's theories, what if we gave everyone on the internet a label about their reputability? If the story you shared was "fake," you could be labeled as such. This would encourage people to seek reputable sources and avoid hurting their reputation. However, this raises a few concerns: would users do the labeling as well?
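Nowak's contrast between one-shot selfishness and repeated-game cooperation can be sketched as a toy simulation. The payoff numbers below are the standard textbook values for the Prisoner's Dilemma, not Nowak's own, and the strategies are deliberately simple.

```python
# A toy iterated Prisoner's Dilemma: "C" = cooperate, "D" = defect.
PAYOFF = {            # (my move, their move) -> my points
    ("C", "C"): 3,    # mutual cooperation
    ("C", "D"): 0,    # I cooperate, they exploit me
    ("D", "C"): 5,    # the temptation to defect
    ("D", "D"): 1,    # mutual selfishness
}

def play(strategy_a, strategy_b, rounds=10):
    """Run repeated rounds; each strategy sees only the opponent's last move."""
    score_a = score_b = 0
    last_a = last_b = "C"            # everyone starts with a clean reputation
    for _ in range(rounds):
        a, b = strategy_a(last_b), strategy_b(last_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        last_a, last_b = a, b
    return score_a, score_b

tit_for_tat = lambda opponent_last: opponent_last   # reciprocate: reputation matters
always_defect = lambda opponent_last: "D"           # pure selfishness

print(play(tit_for_tat, tit_for_tat))       # (30, 30): cooperation pays everyone
print(play(always_defect, always_defect))   # (10, 10): mutual selfishness pays little
print(play(tit_for_tat, always_defect))     # (9, 14): the defector "wins" the matchup
```

The last line is the tension in the reading: the defector beats the cooperator head-to-head, yet two cooperators end up far better off than two defectors, which is why reputation and repeated interaction can keep cooperation alive.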
How do we know that Facebook would censor responsibly? Perhaps a third party like Reuters could help, but the issue of bias would always be present. What if we then tried to change culture through gossip? Dunbar's number would limit the number of people we could effectively reach, but this could also create a domino effect as people gossip and spread the word. We would have to first identify hubs: people who have a high number of connections within a network. We could then ask them to share the information with their followers. Ideally, they would include people from different platforms and on all sides of the political spectrum. The hubs could share one of Boyd's pieces on agnotology or media literacy, or maybe a new guide that covers far-left and far-right beliefs. While this idea would require cooperation on a grand scale, it could very well change how many people read and interpret media.

### Feb 18 Fri - Social Networks

While nodes and ties may seem like strange vocabulary, they make up social networks, and they explain core class concepts like how echo chambers are created and how fake news circulates. Rheingold argues that homophilous networks (groups of people with similar beliefs) tend to read the same news, which leads them to form similar opinions. This "can limit the amount of information that people can get" (p. 206), which leads to echo chambers. While fake news is popular in some homophilous networks, another concept, bridges, explains how fake news goes viral. When people in networks are not connected, that gap is called a structural hole. People who link these unconnected networks are bridges. If a bridge posts a fake news story and it spreads to their three networks, it probably only takes a few more bridges sharing the story for it to go viral. Social networks may further disprove Eli Pariser's belief that website algorithms are causing echo chambers.
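The bridge idea can be illustrated with a toy graph: three tight clusters (homophilous networks) joined only through one person. The names and ties below are invented for illustration.

```python
# Three clusters of friends, with "bria" as the only bridge between them.
from collections import deque

edges = [
    ("ana", "abe"), ("abe", "ava"), ("ava", "ana"),      # cluster 1
    ("ben", "bob"), ("bob", "bea"), ("bea", "ben"),      # cluster 2
    ("cam", "cal"), ("cal", "cle"), ("cle", "cam"),      # cluster 3
    ("bria", "ana"), ("bria", "ben"), ("bria", "cam"),   # bria spans the structural holes
]

def reachable(start, edges, removed=frozenset()):
    """Everyone a shared story can spread to from `start`, skipping removed people."""
    graph = {}
    for u, v in edges:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, ()):
            if neighbor not in seen and neighbor not in removed:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen - {start}

print(len(reachable("ana", edges)))                    # 9: through the bridge, a story reaches everyone
print(len(reachable("ana", edges, removed={"bria"})))  # 2: without the bridge, it stays in one cluster
```

With the bridge present, a story posted in one cluster can reach all ten people; remove her, and it never escapes the original three friends. This is the same mechanism, in miniature, that lets a few bridges make a fake story go viral.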
I believe homophilous networks explain echo chambers better than filter bubbles do, because friends are more likely to consistently share similarly minded articles than algorithms are. Thinking about echo chambers in the context of social networks is key to dismantling them: we are more likely to naturally connect with different people than to read news we actively disagree with. For instance, one of my closest friends is the chairman of their college's Republican group. Because of our friendship, they are more likely to change my political beliefs than any op-ed. From networks, we can also understand how to stop fake news from circulating. If we targeted bridges and asked them to share, say, Boyd's article on agnotology, we would make a bigger impact than just sharing it with our friends, who probably have similar thoughts. Figuring out who has different connections is key, and maybe social bonds, not technology, are the key to solving our problems.

### Feb 22 Tues - Haters

Hate is unconsciously embedded in the internet: I automatically scroll to a video's comments to see if it is disliked, my eyes glazing over any positivity. But why is it this way? Why do people viciously attack each other, specifically over the internet? Reagle argues that the anonymous nature of the internet can cause individuals to experience deindividuation, which is "a loss of a sense of self and social norms." When there is no physical face to say things to, what we say can become more extreme. As shown in studies by Zimbardo and Nogami, people are more likely to punish, steal, or cheat when they cannot be traced for their bad behavior. Internet hate can sometimes feel overwhelming: how do you stop people who have proven that they have no limit? Reagle argues that fighting back or ignoring the trolls is not the answer. Instead, we should try to make cultural change by actively condemning this behavior and supporting the victims.
When the reality show Big Brother airs, Twitter somehow becomes an even more toxic platform. Because the show has 24/7 live feeds, users surveil contestants and use Twitter to hold them accountable for **[racist statements](https://twitter.com/risekemi/status/1146912538405298176?lang=en)** and even their bodily habits. Contestants are often insulted physically and emotionally, with some losing their jobs. While toxic, this environment is also incredibly addictive. When contestant Jackson Michie made racist remarks, all I wanted was **retribution**: he was a racist who deserved to be bullied. But, as discussed in the reading, the bully-battle is far from productive, and it never feels satisfying. I was also in a homophilous community: everyone around me felt the same way, and those who said anything remotely positive about Jackson were quickly accused of supporting a racist. Odds are those people did not even know of Jackson's past. I never understood why a reality show could make me so angry, but it makes sense now that the echo chamber fed the extremism I experienced. Nowadays, I completely avoid Twitter when the show airs. Podcasters and past contestants always advocate for kindness, and with their support, the culture is shifting, slowly but surely.