---
tags: ADS-S22
robots: noindex, nofollow
---
# Race After Technology: Technological Benevolence
## What is a technical fix? Why is it dangerous to frame electronic monitoring technologies as alternatives to incarceration?
[Matt Solone] - A technical fix is an automated or more technologically advanced approach to solving one of society's important problems. As the book states, "Technical fixes operate in a world of euphemisms, where nothing is as it seems." It is a claim to fix something whose true workings are usually harsher than advertised. Framing electronic monitoring (EM) technologies as alternatives to incarceration is dangerous because it comes close to setting individuals up for failure. When we let technology dictate what is right and wrong, it not only introduces the biases of its creators but also ignores circumstances in which the monitored individual's behavior may be justified.
## What are some of the potential benefits of using artificial intelligence in a hiring process? What are some of the likely harms? What were some of the reactions from applicants who interacted with AI technologies during their job search? How could algorithms be used to streamline discrimination?
[Brandon Trahms] AI can be beneficial for speed and automation. Some big companies receive thousands of applications and struggle to sift through all of them. AI provides a solution that scales and can even proactively find candidates. Harms can come from the misclassifications of an AI system, where a good candidate is falsely classified as bad. This hurts both parties: the company may never see that person's resume and lose out on a potentially good candidate, and the candidate may lose out on a job offer. However, I think AI can work both for and against you as a candidate. Where companies may employ AI to weed out resumes, AI-assisted job-searching technologies have been advancing as well to try to match people with potential job offers. All of this can be rather disheartening, though, as it can feel as if your fate is in the hands of a machine rather than a real human who can be reasoned with; after all, AI may not be as accurate as a human. Due to this lack of accuracy, some AI systems may use demographics as stand-ins for other characteristics and link things that are correlated rather than causal. This can unintentionally make these streamlined systems discriminate not on the basis of merit but on the basis of background (the sketch after these responses illustrates the pattern).
[Rica Rebusit] AI makes the process faster for both recruiters and candidates and weeds out applications that aren't a fit. But it can also encode biased recruiting factors learned from a company's previous employees (as with Amazon's tool that penalized women), and although it weeds out applications that aren't a fit, it can miss really well-fitting candidates because of a missed keyword; there is a lack of human judgment. According to one report, applicants are frustrated not only by the lack of human contact, but also because they have no idea how they are evaluated and why they are repeatedly rejected, feeling a heightened sense of worthlessness because "the company couldn't even assign a person for a few minutes. The whole thing is becoming less human."
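A minimal sketch of the proxy discrimination described above, in Python. All the weights, skills, and zip codes are invented for illustration; the point is that a model trained on skewed historical hires never needs to see race directly, because a correlated feature like zip code stands in for it.

```python
# Toy resume-scoring model with weights "learned" from past hires.
# Because past successful hires clustered in certain zip codes, a naive
# model learns to reward those codes. All values here are hypothetical.
LEARNED_WEIGHTS = {
    "zip_94027": +2.0,   # area overrepresented among past hires
    "zip_60621": -2.0,   # area underrepresented among past hires
    "python": +1.0,
    "sql": +0.5,
}

def score_resume(features):
    """Sum the learned weights for the features present on a resume."""
    return sum(LEARNED_WEIGHTS.get(f, 0.0) for f in features)

# Two identically qualified candidates who differ only by zip code:
a = score_resume({"python", "sql", "zip_94027"})
b = score_resume({"python", "sql", "zip_60621"})
print(a, b)  # 3.5 vs -0.5: equal merit, but background decides the outcome
```

The correlation the model exploits is real in the training data, but it is not causal, which is exactly how a "streamlined" system ends up screening on background rather than merit.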
## What are some of the design flaws present in Diversity Inc.? How are zip codes used to assume racial-ethnic identities? How is this assumptive practice connected to a legacy of Jim Crow policies?
[Ethan] The design flaw that Benjamin brings up regarding Diversity Inc. is that it targets customers on the basis of race, among other things, without referring to it as race. Benjamin mentions that by using an individual's first and last name, and then the zip code where race can't be inferred from the last name, Diversity Inc. has effectively coded for race (the sketch below shows the pattern). Benjamin also says that using zip codes as a proxy for race is connected to a legacy of Jim Crow policies, since Jim Crow policies made it possible for that connection to be predictive: "racialized zip codes are the output of Jim Crow policies and the input of New Jim Code practices".
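A hedged sketch of the name-plus-zip-code inference Benjamin describes. The probability tables below are invented placeholders, not real data, and the fallback from surname to zip code mirrors the passage above.

```python
# Hypothetical lookup tables: P(group | surname) and P(group | zip code).
# Real vendors use large proprietary versions of tables like these.
SURNAME_PRIORS = {
    "garcia":     {"hispanic": 0.90, "white": 0.05, "black": 0.05},
    "washington": {"black": 0.87, "white": 0.08, "hispanic": 0.05},
}
ZIP_PRIORS = {
    "60621": {"black": 0.95, "white": 0.03, "hispanic": 0.02},
    "94027": {"white": 0.80, "hispanic": 0.10, "black": 0.10},
}

def infer_group(surname, zip_code):
    """Combine surname and zip-code tables; when the surname is
    uninformative, fall back to zip code alone, which is exactly the
    fallback described in the answer above."""
    s = SURNAME_PRIORS.get(surname.lower())
    z = ZIP_PRIORS.get(zip_code, {})
    if s is None:
        return max(z, key=z.get) if z else None
    combined = {g: s.get(g, 0) * z.get(g, 1) for g in s}
    return max(combined, key=combined.get)

print(infer_group("Washington", "60621"))  # "black": race coded without ever asking
```

Nothing in this pipeline mentions race as an input field, yet its output is a racial label, which is what Benjamin means by coding for race without referring to it.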
## What is a racial fix? What is harmful about private companies creating racial-ethnic data to be sold to others? What are some of the potential harmful consequences of this information being available for sale?
[Faith Fatchen] Racial fixes often come packaged as diversity measures and are sold as an acknowledgement of individuality. However, they center on racial-ethnic data, which leads to concerns about surveillance, privacy, and discrimination against certain groups. As discussed in the book, Diversity Inc., a company that collects racial-ethnic data, is of particular use to companies that aren't allowed to collect this type of data themselves due to civil rights legislation. As previously mentioned, this leads to concerns about surveillance, privacy, and discrimination. For example, the medical device used to measure lung capacity was calibrated differently by race, making it harder for Black workers to demonstrate decreased lung capacity (the arithmetic sketch below shows the effect).
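An illustrative back-of-the-envelope calculation of how a race "correction" in spirometry masks impairment. The numbers are rounded examples, not real clinical reference equations; the roughly 10-15% downward scaling of predicted values for Black patients is the commonly cited range.

```python
# Hypothetical spirometry reading compared against a predicted value,
# with and without a race "correction" applied to the prediction.
predicted_fvc_liters = 5.0   # race-neutral predicted lung capacity (example)
measured_fvc_liters = 4.0    # the worker's actual measurement (example)
RACE_CORRECTION = 0.85       # commonly cited ~15% downward scaling

pct_uncorrected = measured_fvc_liters / predicted_fvc_liters
pct_corrected = measured_fvc_liters / (predicted_fvc_liters * RACE_CORRECTION)

print(f"{pct_uncorrected:.0%} vs {pct_corrected:.0%}")  # 80% vs 94%
# The same lungs look nearly "normal" once the correction is applied,
# so a Black worker's decreased capacity becomes harder to demonstrate.
```

Lowering the baseline a patient is measured against inflates their percent-of-predicted score, which is how a calibration choice turns into a barrier to compensation claims.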
## What is technological benevolence? What happened when Janet Vertesi attempted to keep her pregnancy private from online platforms? Have you ever tried to withhold your private information when engaging with technology? If yes, what happened?
[Skip Moses] Technological benevolence is when a technology aims or claims to address a bias but ultimately ends up reinforcing the New Jim Code. In order to keep her pregnancy private, Janet had to use only cash and gift cards to purchase necessities for the pregnancy. In doing so, she set off red flags as a potential criminal due to her financial habits. I have never been so inclined to withhold private information. My father always paid to have our phone number excluded from the phone book, and I have watched him jump through ridiculous mental hoops to rationalize behaviors he refuses to engage in on the grounds of protecting his information. I always wondered who it was he thought he was protecting his information from.
[Joseph Shifman] Technological benevolence is the attempt to avoid human bias by using an "unbiased" algorithm that in practice actually deepens bias. The example the author provides is HireVue, a service I have used while applying to internships. Unfortunately, an algorithm such as this can still have biases within it. As Arvind Narayanan tweeted, "Human decision makers may be biased, but at least there is a \*diversity\* of biases". When Janet Vertesi attempted to keep her pregnancy hidden, she was flagged as potentially being a criminal by the stores where she was buying gift cards. She realized that some civic values were being undermined and that there were no checks and balances. I always try to withhold as much information as possible when I engage with technology. Unfortunately, some websites make gigantic pop-ups that are hard to navigate if you want to turn off all cookies. I also check advertising services to make sure they are not tracking me, so as not to be served targeted advertising. It still happens sometimes, but major companies like Google have been serving me random ads.
## What are some examples of technologies being framed as neutral or benevolent tools that are being used to expand structural surveillance?
[Joshua Vong] - One example of technologies being framed as neutral or benevolent is electronic monitoring tools. As the author Ruha Benjamin says in the book regarding electronic surveillance, "like other forms of racial fixing, its function is to create vertical realities - surveillance and control for some, security and freedom for others." People will always have concerns about surveillance: some will view it as another form of policing, being watched for every move that is taken, while others will see it as a crime preventer, a safeguard to catch criminals. It all depends on the person's views and experiences.
[Thomas Smale] - The Free Software Foundation is a movement for users to have the freedom to run, copy, distribute, study, change, and improve software. It is free not as in price but as in liberty with what users can do with the software. This is distinct from the open source movement, which emphasizes practical development benefits over these user freedoms. Some open source programs, like the Linux kernel, follow version 2 of the GNU GPL (General Public License) instead of version 3. While version 2 lets everyone see, modify, and share the source code, it does not stop vendors from shipping that code on locked-down devices that refuse to run modified versions; version 3 upholds the principle that anyone who receives the software can actually change and run it. An example of a software being framed as neutral is Ubuntu, one of the most widely used GNU/Linux operating systems in the world. For several years its desktop search feature, much like Finder on the Mac, sent its searches to Canonical, which passed them along to Amazon to display shopping results. Perhaps if more of the software stack adopted version 3 of the GNU GPL, problems like this would be easier to prevent.